Augmented Reality (AR)
Liveposter supports Augmented Reality (AR) experiences that overlay 3D models, videos, and images on physical posters when viewed through a mobile device camera.
Overview
Liveposter's AR system lets you overlay 3D models, videos, and images on physical posters when viewed through a mobile camera. The system uses:
- MindAR - Image tracking library for recognizing physical posters
- A-Frame - 3D/VR framework for rendering AR content
- WebXR - Works in modern mobile browsers without app installation
How it works:
- You add an ar property to your poster JSON configuration
- MindAR uses .mind target files to recognize your physical poster images
- When recognized, A-Frame renders your AR layers (3D models, videos, images) on top
- Users scan physical posters with their mobile browser - no app needed!
JSON Configuration Format
Add an ar object to your poster spec to enable AR:
{
  "mode": "diaporama",
  "images": [{ "src": "poster.jpg" }],
  "ar": {
    "enabled": true,
    "targetFiles": ["poster.mind"],
    "layers": [
      {
        "targetIndex": 0,
        "type": "model",
        "src": "model.gltf",
        "position": { "x": 0, "y": 0, "z": 0 },
        "scale": { "x": 1, "y": 1, "z": 1 }
      }
    ]
  }
}

Configuration properties:
- enabled (boolean) - Turn AR on/off
- targetFiles (array of strings) - Paths to .mind files for image recognition
- layers (array of objects) - AR content to display (3D models, videos, images)
- Each layer must reference a targetIndex (which .mind file to track)
- Each layer has a type: "model", "video", or "image"
- Position, rotation, and scale use the A-Frame coordinate system
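The documented shape can be written down as a TypeScript reference. This is an illustrative sketch inferred from the properties above - the Vec3, ArLayer, and ArConfig names are hypothetical, not types shipped by Liveposter:

```typescript
// Illustrative types for the documented "ar" object (names are hypothetical).
interface Vec3 { x: number; y: number; z: number; }

interface ArLayer {
  targetIndex: number;               // index into targetFiles
  type: "model" | "video" | "image";
  src: string | string[];            // array form for multi-format video
  position?: Vec3;                   // meters, A-Frame coordinates
  rotation?: Vec3;                   // degrees
  scale?: Vec3;                      // multipliers, 1 = original size
  opacity?: number;
  animation?: boolean;               // models
  loop?: boolean;                    // videos
  autoplay?: boolean;                // videos
}

interface ArConfig {
  enabled: boolean;
  targetFiles: string[];
  layers: ArLayer[];
}

// A minimal config matching the JSON example above:
const example: ArConfig = {
  enabled: true,
  targetFiles: ["poster.mind"],
  layers: [{ targetIndex: 0, type: "model", src: "model.gltf" }],
};
```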
Quick Start
1. Install AR Dependencies
AR compilation requires additional dependencies. Install them only when you need AR features:
npm install @tensorflow/tfjs @msgpack/msgpack canvas mathjs ml-matrix svd-js tinyqueue @mediapipe/tasks-vision

Note: These dependencies are large (roughly 200 MB or more) and are only needed for compiling .mind target files. Regular Liveposter animations work without them, which keeps the base package lightweight while letting AR users opt in when needed.
2. Create Target Files
Target files (.mind) contain visual feature data that MindAR uses to recognize your physical poster images.
Option A: Automatic Generation (Recommended)
Use the built-in CLI tool to automatically generate .mind files for all images in your poster specs:
# Generate targets for a single poster
npx liveposter ar-compile-targets poster.json

# Generate targets for a poster list
npx liveposter ar-compile-targets poster-list.json

# Force regenerate existing targets (overwrites all .mind files)
npx liveposter ar-compile-targets --force poster.json

# Preview what would be generated (doesn't compile)
npx liveposter ar-compile-targets --dry-run poster.json

# Verbose output with detailed progress
npx liveposter ar-compile-targets -v poster.json

What the tool does:
- Finds images: Scans all JPG, JPEG, PNG, GIF, BMP, and WEBP files in your specs
- Generates .mind files: Creates targets alongside the original images (image.jpg → image.mind)
- Smart caching: Automatically skips images that already have .mind files
- Progress tracking: Shows a [22/40] counter and real-time status for each file
- Ignores: Remote URLs (http/https) and video files are automatically skipped
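The selection rules above amount to a simple filter. A hypothetical sketch - targetsToCompile and the existing set stand in for the tool's internal filesystem checks:

```typescript
// Sketch of the described selection logic: keep local raster images,
// skip remote URLs and videos, and skip any image that already has a
// sibling .mind file ("smart caching").
const IMAGE_EXT = /\.(jpe?g|png|gif|bmp|webp)$/i;

function targetsToCompile(srcs: string[], existing: Set<string>): string[] {
  return srcs
    .filter((s) => !/^https?:\/\//i.test(s))                      // skip remote URLs
    .filter((s) => IMAGE_EXT.test(s))                             // images only; videos fall out here
    .filter((s) => !existing.has(s.replace(IMAGE_EXT, ".mind"))); // skip already-compiled
}
```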
Command options:
- -f, --force - Overwrite existing .mind files (useful when source images change)
- -d, --dry-run - Preview what would be compiled without actually compiling
- -v, --verbose - Show detailed progress with total image count
- -h, --help - Display help message
Example output:
🎯 Starting AR target compilation...
[1/40] ⊘ Skipped: ./images/image-1.jpg (already exists)
[2/40] ⏳ Processing: ./images/image-2.jpg...
[2/40] ✓ Compiled: ./images/image-2.jpg → image-2.mind (849.4 KB)
[22/40] ⏳ Processing: ./images/poster.jpg... 60%
...
======================================================================
SUMMARY
======================================================================
✓ Compiled: 15
⊘ Skipped: 25 (already exist)
⏱️ Time: 42.3s
======================================================================

Status indicators:
- ✓ Compiled - Successfully generated .mind file
- ⊘ Skipped - Already has a .mind file (use --force to regenerate)
- ⏳ Processing - Currently compiling (shows progress %)
- ❌ Failed - Compilation error (shows error message)
Option B: Manual Generation
- Visit the MindAR Image Compiler
- Upload your poster image (the physical image users will scan)
- Download the generated .mind file
- Place it in your project directory alongside the image
Tips for good tracking:
- Use high-contrast images
- Include distinctive features (corners, edges, patterns)
- Avoid pure white or pure black images
- Minimum recommended size: 400x400 pixels
- Recommended: 480-1920px on longest side
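The size guidelines above can be expressed as a small pre-flight check. This is a hypothetical sketch - the CLI does not necessarily expose such a function:

```typescript
// Sketch of the documented size guidance: 400x400 minimum,
// 480-1920px recommended on the longest side.
type SizeVerdict = "too-small" | "ok" | "above-recommended";

function checkTargetSize(width: number, height: number): SizeVerdict {
  const longest = Math.max(width, height);
  const shortest = Math.min(width, height);
  if (shortest < 400) return "too-small";         // below the 400x400 minimum
  if (longest > 1920) return "above-recommended"; // beyond the 480-1920px guidance
  return "ok";
}
```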
How .mind file compilation works:
The target compilation process extracts and encodes visual features from your images:
- Feature Detection (0-50% progress)
  - Converts the image to greyscale
  - Creates multi-scale image pyramids (using scale factor 2^(1/3))
  - Detects interest points (bright spots and dark spots) at each scale using TensorFlow.js
  - Applies hierarchical clustering to organize features
- Tracking Data Extraction (50-100% progress)
  - Processes smaller-resolution images (256px and 128px)
  - Extracts optimized features for real-time AR tracking
  - Uses CPU-based TensorFlow.js in Node.js
- Binary Encoding
  - Compresses data using the MessagePack format
  - Typical file size: 50-500 KB per image
  - Much smaller than the original images
  - Contains feature points for matching and tracking (not pixel data)
- Result
  - A .mind file containing:
    - Image dimensions
    - Feature points for image recognition (matching)
    - Optimized features for real-time AR tracking
  - The original image is still needed for display - .mind files only contain abstract feature data
Important: The .mind file is used BY MindAR during AR sessions to recognize your physical poster, but you still need the original image for users to scan. Think of it as a “feature map” rather than a copy of the image.
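The pyramid spacing mentioned above can be sketched as follows. Illustrative only: the real compiler builds actual downsampled images at each level, while this just lists the level sizes produced by the 2^(1/3) factor:

```typescript
// Each pyramid level is the previous size divided by 2^(1/3) (~79% of
// the previous level), down to a minimum size; minSize is an assumption.
function pyramidScales(longestSide: number, minSize = 8): number[] {
  const factor = Math.pow(2, 1 / 3);
  const levels: number[] = [];
  for (let s = longestSide; s >= minSize; s /= factor) {
    levels.push(Math.round(s));
  }
  return levels;
}
```

Note that every third level halves the size exactly, which is why 2^(1/3) is a common pyramid spacing.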
Learn more about .mind files:
- MindAR Image Compiler Tool - Web-based compiler with detailed explanations
- MindAR Image Tracking Guide - Understanding the tracking system
- MindAR Best Practices - Tips for optimal AR experiences
3. Add AR Configuration
Add an ar property to your poster spec:
{
  "mode": "diaporama",
  "timing": { "duration": 3000 },
  "images": [{ "src": "poster.jpg" }],
  "ar": {
    "enabled": true,
    "targetFiles": ["poster.mind"],
    "layers": [
      {
        "targetIndex": 0,
        "type": "model",
        "src": "bear.gltf",
        "position": { "x": 0, "y": -0.25, "z": 0 },
        "scale": { "x": 0.05, "y": 0.05, "z": 0.05 },
        "animation": true
      }
    ]
  }
}

4. Test in Development
npx liveposter ar-dev my-poster.json

Open the URL on your mobile device and point the camera at your poster image.
5. Build for Production

npx liveposter ar-build my-poster.json --output dist-ar

Deploy the dist-ar folder to any static hosting (Vercel, Netlify, etc.).
AR Layer Types
3D Models
Display GLTF/GLB 3D models:

{
  "targetIndex": 0,
  "type": "model",
  "src": "model.gltf",
  "position": { "x": 0, "y": 0, "z": 0 },
  "rotation": { "x": 0, "y": 0, "z": 0 },
  "scale": { "x": 1, "y": 1, "z": 1 },
  "animation": true,
  "opacity": 1
}

Supported formats: GLTF (.gltf, .glb)
Videos
Overlay video content:

{
  "targetIndex": 0,
  "type": "video",
  "src": "video.mp4",
  "position": { "x": 0, "y": 0, "z": 0 },
  "scale": { "x": 1, "y": 1, "z": 1 },
  "loop": true,
  "autoplay": true,
  "opacity": 1
}

Supported formats: MP4, WebM, OGG
Multi-format support:
{ "src": ["video.mp4", "video.webm"], "type": "video" }

Images
Display 2D images:

{
  "targetIndex": 0,
  "type": "image",
  "src": "overlay.png",
  "position": { "x": 0, "y": 0, "z": 0 },
  "scale": { "x": 1, "y": 1, "z": 1 },
  "opacity": 1
}

Best format: PNG (supports transparency)
Position, Rotation, Scale
Liveposter uses the A-Frame 3D coordinate system for positioning AR content. All layers support position, rotation, and scale properties.
Position
Units are in meters relative to the target (poster) center:
- x: Left (negative) to Right (positive)
- y: Down (negative) to Up (positive)
- z: Away from camera (negative) to Toward camera (positive)
{ "position": { "x": 0.5, "y": 0.2, "z": -0.1 } }
// 0.5m to the right, 0.2m up, 0.1m behind the poster

Rotation
Degrees around each axis:
- x: Pitch (rotate around the horizontal axis)
- y: Yaw (rotate around the vertical axis)
- z: Roll (rotate around the depth axis)
{ "rotation": { "x": 0, "y": 45, "z": 0 } }
// Rotated 45° around the vertical axis

Scale
Multipliers where 1 = original size:

{ "scale": { "x": 0.1, "y": 0.1, "z": 0.1 } }
// Scaled to 10% of original size

For more details on the 3D coordinate system, see the A-Frame documentation.
Multiple Targets
Track multiple posters simultaneously:

{
  "ar": {
    "enabled": true,
    "targetFiles": [
      "poster1.mind",
      "poster2.mind",
      "poster3.mind"
    ],
    "layers": [
      { "targetIndex": 0, "type": "model", "src": "bear.gltf" },
      { "targetIndex": 1, "type": "model", "src": "raccoon.gltf" },
      { "targetIndex": 2, "type": "video", "src": "video.mp4" }
    ]
  }
}

Poster Lists with AR
Build AR experiences from multiple posters:
{
  "version": "0.0.1",
  "title": "AR Poster Collection",
  "description": "Multiple AR-enabled posters",
  "posters": [
    { "id": "poster-1", "configPath": "./posters/poster1.json", "enabled": true },
    { "id": "poster-2", "configPath": "./posters/poster2.json", "enabled": true }
  ]
}

Build the merged AR experience:

npx liveposter ar-build poster-list.json --output dist-ar

How it works:
- All .mind files are combined into a single array
- Layer targetIndex values are automatically adjusted to avoid conflicts
- Assets are namespaced by poster ID
- Single AR scene detects all enabled posters
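The index re-basing described above can be sketched like this. Hypothetical: mergeArConfigs and its shapes are illustrative, and the real merge step is internal to the liveposter build:

```typescript
// Each poster's layers keep pointing at that poster's own targets
// because targetIndex is shifted by the number of .mind files
// collected from earlier posters.
interface Layer { targetIndex: number; type: string; src: string; }
interface PosterAr { id: string; ar: { targetFiles: string[]; layers: Layer[] }; }

function mergeArConfigs(posters: PosterAr[]) {
  const targetFiles: string[] = [];
  const layers: (Layer & { posterId: string })[] = [];
  for (const { id, ar } of posters) {
    const offset = targetFiles.length; // targets collected so far
    targetFiles.push(...ar.targetFiles);
    for (const l of ar.layers) {
      layers.push({ ...l, targetIndex: l.targetIndex + offset, posterId: id });
    }
  }
  return { enabled: true, targetFiles, layers };
}
```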
CLI Commands
Compile AR Targets

npx liveposter ar-compile-targets [options] <input>

Automatically generates .mind target files for all images in your poster specs.
Options:
- -v, --verbose - Enable verbose logging with progress bars
- -f, --force - Overwrite existing .mind files
- -d, --dry-run - Preview what would be compiled without actually compiling
- -h, --help - Show help message
Examples:
# Generate targets for all images in a spec
npx liveposter ar-compile-targets poster.json

# Generate for multiple posters in a list
npx liveposter ar-compile-targets poster-list.json

# Force regenerate (overwrites existing .mind files)
npx liveposter ar-compile-targets --force poster.json

# See what would be generated
npx liveposter ar-compile-targets --dry-run poster-list.json

# Verbose output with progress
npx liveposter ar-compile-targets -v poster.json

Development Server
npx liveposter ar-dev <spec-file.json>

Features:
- Hot-reload on file changes
- Watches spec file, assets, and target files
- Auto-rebuilds AR scene
- Runs on port 3100 by default (configurable with PORT_AR)
Custom port:
PORT_AR=8080 npx liveposter ar-dev my-poster.json

Build for Production
npx liveposter ar-build <input> [options]

Options:
- --output, -o <dir> - Output directory (default: dist-ar)
- --aframe-version <version> - A-Frame version (default: 1.6.0)
- --mindar-version <version> - MindAR version (default: 1.2.5)
Examples:
# Build single poster
npx liveposter ar-build poster.json

# Custom output directory
npx liveposter ar-build poster.json --output my-ar-build

# Custom CDN versions
npx liveposter ar-build poster.json --aframe-version 1.6.0

Dual Server Setup
Run both the demo and AR servers simultaneously:
# Option 1: Use npm script
npm run test:dual:dev

# Option 2: Manual (separate terminals)
# Terminal 1 - Demo server (port 3000)
npx liveposter poster-list.json

# Terminal 2 - AR server (port 3100)
npx liveposter ar-dev poster-list.json

Why run both?
- View regular poster animations on desktop
- Test AR experience on mobile
- Same poster list, different output formats
Asset Optimization
3D Models
- Use GLTF/GLB format (GLB is smaller, single-file)
- Optimize polygon count (< 10k triangles recommended)
- Compress textures (max 1024x1024 for mobile)
- Use Draco compression for GLB files
Videos
- Use H.264/MP4 for best compatibility
- Keep resolution ≤ 1080p
- Optimize bitrate (2-4 Mbps recommended)
- Provide multiple formats (MP4, WebM) for fallback
Images
- Use PNG for transparency
- Keep dimensions reasonable (≤ 2048px)
- Optimize file size with tools like TinyPNG
Best Practices
- Limit simultaneous targets: 3-5 targets maximum for good performance
- Reduce layer count: Keep layers per target under 5
- Use CDN for assets: Faster loading, better caching
- Test on real devices: Desktop simulators don’t reflect mobile performance
- Enable HTTPS: Required for camera access in production
Deployment
Static Hosting
The AR build output is a static HTML file:
# Build
npx liveposter ar-build poster.json --output dist-ar

# Deploy to any static host
# - Netlify: Drag dist-ar folder
# - Vercel: Deploy dist-ar as root
# - GitHub Pages: Push dist-ar to gh-pages branch
# - AWS S3: Upload dist-ar contents

Requirements
- HTTPS required: Camera access requires a secure context
- CORS headers: Needed if loading assets from a different domain
- Mobile-friendly: Viewport meta tag (included in template)
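For example, if assets are served from a separate host on Netlify, that host must send CORS headers. This is a generic Netlify _headers sketch (hypothetical path), not something the AR build generates:

```
# Hypothetical Netlify _headers entry on the asset host
/assets/*
  Access-Control-Allow-Origin: *
```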
Troubleshooting
Section titled “Troubleshooting”Camera Not Triggering
Problem: AR scene loads but camera doesn't start

Solutions:
- Ensure you’re using HTTPS (required in production)
- Check browser permissions for camera access
- Try a different browser (Chrome/Safari recommended)
Target Not Detected
Problem: Camera works but poster isn't recognized

Solutions:
- Ensure good lighting conditions
- Hold camera steady at 30-50cm from poster
- Verify the .mind file matches the physical poster
- Use high-contrast target images
- Regenerate the .mind file with a better source image
Poor Tracking
Problem: AR content jumps or disappears

Solutions:
- Increase distance from poster
- Improve lighting conditions
- Use higher-quality target images
- Avoid reflective surfaces on poster
Performance Issues
Problem: Laggy or slow AR experience

Solutions:
- Reduce model complexity
- Lower video resolution
- Decrease number of targets/layers
- Test on target devices (not just desktop)
Files Not Loading
Problem: 404 errors for assets

Solutions:
- Use relative paths in your JSON config
- Verify assets are copied to dist-ar during build
- Check browser console for specific errors
- Ensure static server is serving from correct directory
Examples
See the AR examples in the repository:
- packages/demo-server/public/ar-examples/ar-demo-1-model.json - Single 3D model
- packages/demo-server/public/ar-examples/ar-demo-2-multi-model.json - Multiple targets
- packages/demo-server/public/ar-examples/ar-demo-3-mixed-layers.json - Mixed layer types
Browser Support
Supported browsers:
- iOS Safari 11+
- Android Chrome 79+
- Modern mobile browsers with camera access
No app installation required - everything runs in the browser.
Resources
- MindAR Documentation
- MindAR Image Compiler - Generate .mind files
- A-Frame Documentation
- GLTF Optimization Tools