How to Accelerate Quest Development with Agentic Workflows (Unity AI + MCP)

Introduction to Agentic Workflows in VR Development

Welcome and Introduction

  • The session begins with a warm welcome, inviting viewers to confirm their audio connection.
  • Dilmer, a developer advocate at Meta, is introduced. Ashray shares a personal anecdote about learning from Dilmer's tutorial videos five years ago.
  • The focus of the lecture is on agentic workflows for VR development using AI and MCP (Model Context Protocol).

Overview of the Session

  • Dilmer expresses excitement about discussing new tools for AI in VR development that were previously mentioned at GDC.
  • He emphasizes that the goal is not to replace artistic roles but to enhance workflow efficiency for developers.

Understanding Agentic Workflows

Purpose of Agentic Workflows

  • The main idea behind agentic workflows is to improve productivity rather than just generating art or assets.
  • Dilmer aims to demonstrate how these tools can help developers become exponentially more productive without needing extensive resources.

Benefits of AI in Game Development

  • Many developers feel overwhelmed by the plethora of AI tools available but recognize their potential for simplifying game development processes.
  • AI can bridge gaps in resources and time, allowing content creators to build games more efficiently despite other commitments.

Agenda for Today's Discussion

Key Topics Covered

  • The agenda includes an exploration of agentic workflows in VR, Unity AI offerings, and Meta's Unity MCP extensions.
  • Updates on MCP and Horizon OS will be shared along with new CLI tools useful during hackathons.

Practical Application

  • Emphasis on utilizing these tools collaboratively within teams as they work on projects together.

Live Demonstration and Tools Overview

Live Demo Expectations

  • A live demonstration will showcase results from a project being built during the session, highlighting practical applications of discussed concepts.

Understanding MCP Capabilities

  • Agentic workflows are designed not to replace existing roles but to enable faster shipping of products while leveraging Unity’s capabilities alongside Claude Code integration.
  • Multiple MCP options are available such as CoPlay and Unity AI assistant; these facilitate interaction with Unity scenes effectively.

Agentic Workflows and Development Tools

Benefits of Rapid Project Development

  • The ability to build projects in just three weeks significantly enhances efficiency, allowing developers to focus on building rather than getting bogged down by intricate coding details.
  • This rapid development process enables teams to ship products faster and maintain a productive workflow.

Introduction to HZDB and Command Line Tools

  • hzdb is a new command-line tool that wraps Android's ADB, simplifying device connection and performance-information retrieval from the terminal.
  • It also includes an MCP server that exposes Meta documentation and VR components through command-line execution.

Agentic Workflows in VR Development

  • The agentic tools repository announced at GDC offers various skills for creating WebXR applications without extensive documentation review.
  • A standalone hzdb binary available on npm supports continuous integration, making it suitable for larger teams with automated deployment needs.

Understanding Agentic Workflows

  • Agentic workflows allow developers to interact seamlessly with Unity using knowledge from the Meta SDK, simplifying the integration of VR features like hand tracking.
  • Historically complex processes have evolved into more user-friendly methods where developers can drag-and-drop components instead of manually integrating folders.

Future Developments: Meta AI Developer Assistant

  • A new Meta AI developer assistant is being developed to enhance interaction with MCPs through the Meta Quest Developer Hub or an IDE with hzdb enabled.
  • This assistant will provide easy access to documentation and support within the development portal, streamlining the development process further.

Integration with Unity AI Assistant

  • CoPlay is an open-source project that facilitates agentic workflows in Unity; both free and paid versions are available for use.
  • The Unity AI assistant package integrates deeply with Unity, offering a chat window interface for real-time assistance during development tasks.

Practical Example: Changing Scene Lighting in Unity

  • The Unity AI Assistant allows users to change lighting conditions dynamically based on commands given through its chat interface.
  • Utilizing LLM capabilities, it can manage scene elements effectively while providing end-to-end control over UI creation.
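Under the hood, a chat request like "make the scene night-time" boils down to edits of this kind. The sketch below is illustrative only; the class name, field wiring, and color values are assumptions, not code shown in the session:

```csharp
using UnityEngine;

// Hypothetical sketch of the kind of edit the assistant performs when asked
// to darken a scene: dim the sun and cool the ambient color.
public class SceneLightingTweaks : MonoBehaviour
{
    [SerializeField] private Light sun; // assumed directional-light reference

    public void ApplyNightLighting()
    {
        if (sun != null)
        {
            sun.intensity = 0.2f;
            sun.color = new Color(0.55f, 0.6f, 0.8f); // cool moonlight tint
        }
        RenderSettings.ambientMode = UnityEngine.Rendering.AmbientMode.Flat;
        RenderSettings.ambientLight = new Color(0.05f, 0.06f, 0.1f);
    }
}
```

The point is that the assistant manipulates the same `RenderSettings` and `Light` properties a developer would edit by hand, just driven by natural-language prompts.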

Claude Code and AI Integration in Unity

Overview of Claude Code and AI Assistant

  • The integration of Claude Code with Unity allows additional agents to be used, expanding development capabilities.
  • Developers can bring in agentic tools such as Cursor, Gemini CLI, and Codex alongside the AI Assistant to streamline their workflow.
  • The AI Assistant facilitates rapid iteration by helping developers implement features without extensive manual coding.
  • Performance optimization is another key feature; the assistant aids in identifying performance bottlenecks within games.
  • The integration provides suggested optimizations based on analysis of frame rates and allocations.

Enhancements for VR Development

  • The runtime optimizer complements existing tools by offering additional support specifically for VR optimization tasks.
  • Editor tooling is crucial for creating custom tools in Unity, which can be time-consuming; AI tools can assist in this process.
  • Understanding Unity's internal workings is essential when implementing editor tools, as changes may affect functionality.
  • Asset store items can also be leveraged to create advanced custom tooling within Unity projects.

Features of Meta Unity MCP Extensions

  • Meta MCP extensions enhance understanding of core components like hand tracking and controller tracking in VR experiences.
  • These extensions require specific SDK installations (Meta XR Core SDK and Meta XR Interaction SDK).

Setting Up a VR Project with Unity

  • A basic demo illustrates how to prepare a VR project using the necessary components from the Meta MCP extensions.
  • Installing the Unity AI Gateway is essential for utilizing these extensions effectively within your project setup.
  • Developers must accept cloud connections when integrating with different IDE options like VS Code or Rider.

How to Add Unity MCP through Claude Code

Simplifying Game Object Management

  • The process of adding Unity MCP via Claude Code is streamlined, eliminating the previous complexity of navigating packages and prefabs.
  • Developers no longer need to manually associate scripts with game objects for components like camera rigs; this can now be done through Claude Code.
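As a rough illustration, a Unity MCP server can be registered with Claude Code through a project-level `.mcp.json`. The server name, transport type, and port below are placeholders, not the exact values from the session:

```json
{
  "mcpServers": {
    "unity": {
      "type": "http",
      "url": "http://localhost:8080/mcp"
    }
  }
}
```

Once registered, the agent can discover the server's tools and act on the open Unity scene without any manual package or prefab wiring.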

Access Control in Unity

  • Users can manage access permissions for their applications, allowing them to execute commands and modify scenes while ensuring security by revoking unnecessary access.

Enhancing VR Project Components

  • The presentation discusses creating a VR project that includes essential features such as a camera rig and controller support but emphasizes the potential for further enhancements.
  • More advanced setups, like an interaction rig that supports locomotion and hand tracking, can be added using simple prompts.

Realistic Game Development Approach

  • When developing more complex game ideas, it's beneficial to control AI actions rather than relying on it to create everything at once.
  • A feature-based approach is recommended for scalability; starting with a minimum viable product (MVP) allows developers to iterate effectively.

Example Implementation: Bowling Game Mechanics

  • An example prompt was provided for implementing a Meta Quest hand tracking mechanic in a bowling game, showcasing how specific instructions lead to controllable code generation.
  • The generated code included elements like skyboxes and hand tracking functionality, enabling developers to review and refine their work before proceeding.
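As a rough sketch of what such a prompt might produce, here is a minimal pinch-to-release mechanic built on the Meta XR Core SDK's `OVRHand` component. The force value, field wiring, and class name are assumptions, not the talk's actual generated code:

```csharp
using UnityEngine;

// Minimal sketch of a pinch-to-release bowling mechanic using OVRHand
// from the Meta XR Core SDK. Tuning values are illustrative.
public class PinchBallRelease : MonoBehaviour
{
    [SerializeField] private OVRHand hand;          // right-hand OVRHand in the camera rig
    [SerializeField] private Rigidbody ball;        // the bowling ball
    [SerializeField] private float throwForce = 6f; // assumed tuning value

    private bool wasPinching;

    private void Update()
    {
        bool pinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        // Ball follows the hand while pinched, then launches forward on release.
        if (pinching)
        {
            ball.isKinematic = true;
            ball.position = hand.transform.position;
        }
        else if (wasPinching)
        {
            ball.isKinematic = false;
            ball.AddForce(hand.transform.forward * throwForce, ForceMode.Impulse);
        }
        wasPinching = pinching;
    }
}
```

Keeping the mechanic this small is what makes the generated code easy to review and refine before moving on, as the bullet above describes.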

Shader Development and Game Mechanics in VR

Inspiration and Shader Implementation

  • The speaker discusses the addition of shaders to a project, inspired by the visual aesthetics of the Tron movie.
  • A futuristic glowing shader was requested for a ball, with color transitions; however, it initially worked only in the Unity editor and not on Meta Quest.
  • The importance of an iterative approach is highlighted, allowing for testing features before full game development.
  • The speaker identifies issues with rendering in VR, emphasizing the need for specific shader parameters to ensure proper display in both eyes.
  • Additional features like glow color and intensity were added automatically by AI beyond initial requests, showcasing AI's capability to enhance designs.

Technical Adjustments for VR Compatibility

  • The shader required multi-compilation and instancing options to function correctly on stereo devices like Meta Quest.
  • The complexity of shader creation is acknowledged; however, leveraging AI allowed focus on C# programming while enhancing game polish.
  • A simple pull ball mechanic was developed as part of transitioning from a prototype to a full game experience.
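The stereo fix described above typically comes down to the following boilerplate in a built-in render pipeline shader. This is a sketch of the pattern, not the session's actual shader:

```hlsl
// Inside CGPROGRAM, with #include "UnityCG.cginc".
// Single-pass instanced stereo support so the shader renders in both eyes.
#pragma multi_compile_instancing

struct appdata
{
    float4 vertex : POSITION;
    UNITY_VERTEX_INPUT_INSTANCE_ID      // per-eye instance id
};

struct v2f
{
    float4 pos : SV_POSITION;
    UNITY_VERTEX_OUTPUT_STEREO          // routes output to the correct eye
};

v2f vert(appdata v)
{
    v2f o;
    UNITY_SETUP_INSTANCE_ID(v);
    UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(o);
    o.pos = UnityObjectToClipPos(v.vertex);
    return o;
}
```

Without these macros, a shader that looks correct in the editor often renders in only one eye (or not at all) on a stereo device like Quest.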

Game Loop Development

  • The speaker aimed to create a gameplay loop similar to Beat Saber but using hand tracking instead of traditional sword mechanics.
  • Procedurally generated rings were introduced as objects within the game environment, minimizing asset creation efforts while maintaining a minimalistic UI style.
  • Results showed successful generation of procedural rings and an environment reflecting the desired Tron aesthetic.
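A procedural ring of the kind described could be sketched like this; the segment count and radii are illustrative defaults, not the project's actual values:

```csharp
using UnityEngine;

// Hedged sketch of a procedural ring (annulus) mesh: two concentric rims
// of vertices stitched into quads. Parameters are assumed defaults.
[RequireComponent(typeof(MeshFilter))]
public class RingMeshGenerator : MonoBehaviour
{
    [SerializeField] private int segments = 48;
    [SerializeField] private float innerRadius = 0.8f;
    [SerializeField] private float outerRadius = 1.0f;

    private void Awake()
    {
        var vertices = new Vector3[(segments + 1) * 2];
        var triangles = new int[segments * 6];

        for (int i = 0; i <= segments; i++)
        {
            float angle = (float)i / segments * Mathf.PI * 2f;
            var dir = new Vector3(Mathf.Cos(angle), Mathf.Sin(angle), 0f);
            vertices[i * 2]     = dir * innerRadius;  // inner rim
            vertices[i * 2 + 1] = dir * outerRadius;  // outer rim
        }

        for (int i = 0; i < segments; i++)
        {
            int v = i * 2;
            // Two triangles per quad between the inner and outer rims.
            triangles[i * 6]     = v;
            triangles[i * 6 + 1] = v + 1;
            triangles[i * 6 + 2] = v + 2;
            triangles[i * 6 + 3] = v + 1;
            triangles[i * 6 + 4] = v + 3;
            triangles[i * 6 + 5] = v + 2;
        }

        var mesh = new Mesh { vertices = vertices, triangles = triangles };
        mesh.RecalculateNormals();
        GetComponent<MeshFilter>().mesh = mesh;
    }
}
```

Generating geometry this way avoids authoring art assets entirely, which fits the minimalist Tron-style aesthetic described above.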

Iterative Design Process

  • After initial iterations focused on basic mechanics, further refinements included adding trailing effects based on reference images from past projects.
  • Reference images helped guide design choices without starting from scratch; this collaborative use of prompts and visuals proved effective.

Multimodal Testing with Controllers

  • Testing involved integrating hand tracking alongside controller functionality within the Meta XR simulator for enhanced user interaction options.
  • Continuous prompt generation led to innovative ideas during development phases, demonstrating an adaptive design process that embraced experimentation.

Testing Controller Functionality in Meta XR

Initial Testing and Features

  • The speaker discusses the need for controller tracking in Meta XR, initially unaware of hand tracking capabilities. This led to the addition of controller support.
  • Testing was conducted using both a headset and a simulator after implementing controller support, which encouraged further feature development.
  • The speaker emphasizes committing code frequently for each small feature, allowing easy resets if issues arise during testing.

Enhancements and User Experience

  • UI improvements were made, transitioning from a simple label indicating missed objects to a more visually appealing progress bar.
  • Explosion effects were added using mini cubes instead of traditional VFX, enhancing visual feedback when objects are destroyed.
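The mini-cube burst could be sketched as follows; the counts, sizes, and forces are assumed tuning values, not the demo's exact numbers:

```csharp
using UnityEngine;

// Sketch of a "mini cube" explosion: spawn small cubes at the destroyed
// object's position and push them outward with physics.
public static class CubeExplosion
{
    public static void Spawn(Vector3 center, int count = 12, float force = 3f)
    {
        for (int i = 0; i < count; i++)
        {
            var cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
            cube.transform.position = center + Random.insideUnitSphere * 0.1f;
            cube.transform.localScale = Vector3.one * 0.05f;

            var rb = cube.AddComponent<Rigidbody>();
            rb.AddExplosionForce(force, center, 0.5f, 0.2f, ForceMode.Impulse);

            Object.Destroy(cube, 2f); // clean up after the burst
        }
    }
}
```

Physics-driven primitives like this are a cheap stand-in for a particle system, which is presumably why they were chosen over traditional VFX.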

Feature Development and Game Creation

Inspiration for New Projects

  • Positive results from the prototype inspired the speaker to create various new games, highlighting the importance of experimenting with tools and prompts.
  • The consolidation of different prompts allowed for more polished or larger projects that were previously unfeasible due to lack of features.

Introduction to Immersity Logger

Simplifying Debugging Processes

  • Immersity Logger is introduced as a tool that simplifies logging during VR development by allowing custom logs without extensive manual entry.
  • Developers can add scripts in Unity that expose object properties at runtime, making it easier to troubleshoot issues while wearing a headset.
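The underlying idea, exposing a component's properties at runtime so they can be read in-headset, can be sketched with reflection. This is illustrative only and not Immersity Logger's actual API:

```csharp
using System.Reflection;
using UnityEngine;

// Hypothetical sketch: periodically dump a component's public properties
// to the log so they can be inspected while wearing a headset.
public class RuntimePropertyLogger : MonoBehaviour
{
    [SerializeField] private Component target;    // component to inspect
    [SerializeField] private float interval = 1f; // seconds between dumps

    private float nextLogTime;

    private void Update()
    {
        if (target == null || Time.time < nextLogTime) return;
        nextLogTime = Time.time + interval;

        foreach (PropertyInfo prop in target.GetType()
                     .GetProperties(BindingFlags.Public | BindingFlags.Instance))
        {
            // Skip write-only and indexed properties.
            if (!prop.CanRead || prop.GetIndexParameters().Length > 0) continue;
            Debug.Log($"{target.GetType().Name}.{prop.Name} = {prop.GetValue(target)}");
        }
    }
}
```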

AI Integration in Debugging

  • Future updates will enable voice commands within Immersity Logger to make real-time fixes while debugging VR applications, streamlining the troubleshooting process significantly.

Agentic Workflows with Horizon OS

Documentation Accessibility

  • Agentic workflows depend on access to information; Meta now provides its documentation in agent-friendly Markdown structures that are easier for LLMs and developers to consume.

Meta Quest Developer Hub and Unity AI Tools

Overview of Meta Quest Developer Hub

  • The Meta Quest Developer Hub is a lesser-known tool located under settings, specifically in the AI tools section. It facilitates various functionalities for developers.
  • Users can access an MCP (Model Context Protocol) server through VS Code, which allows them to check device connections, retrieve logs, and access Unity documentation.

Live Demo of Unity AI with CoPlay

  • A live demonstration showcases how Unity AI operates alongside CoPlay, highlighting its capabilities and integration.
  • The demo involves running MCP servers and Claude Code to connect with the MCP, exposing a URL for interaction.

Functionality of MCP Tools

  • Once CoPlay is downloaded and the server started, users can manage various tasks such as executing menu items or capturing screenshots using built-in tools.
  • The AI's ability to take screenshots helps it learn from its actions; this emphasizes the importance of prompt design in enhancing user experience.

VR-Specific Tools Provided by Meta

  • Meta offers specialized tools for VR development that simplify adding components like camera rigs or interactive canvases.
  • Developers are encouraged to utilize these resources to streamline their workflow when creating VR applications.

Command Line Interaction with MCP

  • Demonstrating command line usage shows how to execute commands within the MCP environment effectively.
  • A simple command retrieves information about game objects in the hierarchy, showcasing real-time interaction capabilities with Unity's MCP.

Unity MCP: Setting Up Grab Interactions

Initial Setup and Object Interaction

  • The demo involves a grabbable object positioned about 1 meter from the camera, scaled to 10%. Initial attempts at live demos often face challenges.
  • The system is executing various grab interactions, indicating that a comprehensive rig has been added, which includes multiple wizards for setup.
  • The necessary components for grabbing are identified; however, the actual grab component is missing. A hand grab interaction is required to proceed.

Adding Components and Configuration

  • Unity MCP will create a cube that is grabbable by adding the essential components automatically; trust in the system's ability to configure things correctly is emphasized.
  • Warnings during setup can be bypassed through player settings and MCP configuration options, allowing users to customize their experience based on comfort levels.

Testing the Simulator

  • After letting the agent work for about a minute, the cube becomes fully interactable. It remains positioned 1 meter away at 10% scale.
  • Activating Meta XR Simulator requires using "activate" instead of "enable" to ensure proper functionality within Unity.

Troubleshooting Interactivity Issues

  • Despite having a working rig, the cube may not be visible due to gravity affecting its rigid body component. Adjustments are needed for visibility and interactivity.
  • Users can enable different controller settings but must ensure that objects like cubes have appropriate configurations (e.g., kinematic properties).

Final Adjustments and AI Assistance

  • To save time, making the cube kinematic temporarily allows testing without further delays while ensuring it remains interactable.
  • Attempts to grab the cube reveal issues with its current state; troubleshooting involves checking if it’s set as kinematic or if other components are misconfigured.
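The temporary workaround reads roughly like this in code (a sketch of the described fix, not the demo's script):

```csharp
using UnityEngine;

// Quick fix from the demo, sketched: make the cube's rigid body kinematic
// so gravity stops pulling it out of reach while testing grab interactions.
public class MakeKinematicOnStart : MonoBehaviour
{
    private void Start()
    {
        var rb = GetComponent<Rigidbody>();
        rb.isKinematic = true;  // no physics forces act on it
        rb.useGravity = false;  // stays where it was placed
    }
}
```

A kinematic body still participates in grab interactions driven by the transform, which is why this is a reasonable shortcut during iteration.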

Resolving Component Errors

  • AI assistance is sought to fix problems related to component setups; errors indicate that certain properties may not be linked correctly.
  • The process of updating components continues as adjustments are made based on feedback from error messages regarding rigid body links.

Conclusion of Setup Process

  • Final checks confirm whether all necessary links between components exist before running tests again. If issues persist, alternative projects may provide insights into successful setups.

Troubleshooting Grab Interaction in Unity

Initial Setup and Issues

  • The speaker begins troubleshooting a grab interaction component in Unity, noting that it is not functioning as expected.
  • After adjusting settings on the grabbable component, an error appears but then clears, indicating some progress has been made.
  • The issue seems to stem from the configuration of the grabbable object; specifically, the pointable element and rigid body settings are highlighted as potential problems.

Identifying Configuration Problems

  • The speaker identifies missing elements in the setup of the grabbable object and decides to delete a reference object used for comparison.
  • They mention waiting for components to finish launching before proceeding with manual adjustments instead of relying on AI assistance.

Demonstrating Functionality

  • Upon resolving issues with the grabbable object, they reflect on how a screenshot could have expedited identifying problems during the demo.
  • The final prototype demo showcases various features created using AI, including an OVR camera rig and obstacles within Unity.

Exploring Features and Enhancements

Customization Options

  • A ring mesh generator allows for customization of depth and segments to create different shapes without breaking existing designs.
  • Shaders can be modified dynamically; for example, changing colors enhances visual appeal based on gameplay requirements.

Testing Gameplay Mechanics

  • The game runs smoothly in simulation mode where controller testing occurs; scoring is noted but limited due to simulator constraints.

Unity AI Assistant Overview

Integration with Claude Code

  • Discussion shifts to Unity's AI Assistant, which operates similarly to Claude Code but is integrated directly into Unity’s interface.

Script Generation Capabilities

  • Users can request specific actions like adding a red cube or creating scripts with defined parameters (e.g., rotation speed).

AI Functionality Insights

Real-time Coding Assistance

  • The assistant provides inline coding suggestions with color coding similar to markdown syntax, enhancing readability during script creation.

Resource Management Considerations

  • It’s important to note that using these AI features consumes credits from Unity Cloud services, impacting resource management for developers.


Introduction to Unity AI and Shader Development

Overview of Unity AI Features

  • The speaker demonstrates a simple example of using Unity AI, showcasing the addition of a rotator script to a red cube, which allows it to rotate at a specified speed.
  • The demonstration includes hitting play in Unity to visualize the rotating cube, indicating that multiple instances of Unity may affect performance.
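A plausible version of the demonstrated rotator script; the class and field names are assumptions, not the exact generated code:

```csharp
using UnityEngine;

// Spins the object it is attached to (the red cube in the demo)
// around its Y axis at a configurable speed.
public class Rotator : MonoBehaviour
{
    [SerializeField] private float rotationSpeed = 90f; // degrees per second

    private void Update()
    {
        transform.Rotate(Vector3.up, rotationSpeed * Time.deltaTime);
    }
}
```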

Future Content Plans

  • The speaker mentions plans for creating more videos on advanced use cases related to Unity AI and shader development.

Discussion on Shader Optimization and VR Challenges

Personal Experiences with Shaders

  • A participant relates their struggles with shaders, expressing a desire for AI assistance in handling complex visual tasks while focusing on C# development.

Questions about AI Optimization

  • The host asks if the current state of AI tools for building shaders is optimized. The response indicates that optimization is still lacking but acknowledges available tools for performance enhancement.

Limitations in Current Tools

  • The speaker shares experiences with shader tools that struggled to provide effective solutions without additional context or information from the user.

Exploring Practical Applications and Costs

Hands-On Experimentation

  • A participant expresses interest in experimenting with VR interactions, specifically simulating hand collisions with objects like tables.

Performance Insights from Live Demos

  • During Q&A, the speaker reflects on testing an early version of MCP (Model Context Protocol) support within Unity, noting its pre-release status and recommending it as a playground for experimentation.

Cost Considerations When Using APIs

API Usage and Financial Implications

  • Discussion highlights potential costs associated with using API keys versus authentication methods; caution is advised regarding expenses incurred during development.

Continuous Improvement Observed

  • The speaker notes ongoing improvements in API versions released weekly, suggesting that while initial experiences were poor, recent updates have significantly enhanced usability.

Creating Audio Assets in Unity

Capabilities of Unity AI Components

  • Confirmation that Unity's AI components can create audio files such as sound effects (SFX) and music. The speaker emphasizes licensing considerations when presenting content.

Unity AI Offerings and Tools

Discussion on Cursor vs. CoPlay

  • The speaker discusses the audio generation capabilities integrated into Unity's AI offerings, indicating a focus on exploring these features.
  • A question arises about the comparison between Cursor as an IDE and CoPlay; the speaker confirms prior use of Cursor for scripting purposes.
  • The distinction is made that while Cursor functions as an IDE, CoPlay serves as an MCP (Model Context Protocol) bridge between various tools like VS Code and Rider.

Functionality of Co-Create

  • CoPlay acts as a bridge that exposes game objects and hierarchy information within Unity, enhancing development efficiency.
  • The speaker notes that Unity's own offering includes an integrated chat feature, making it more robust than Cursor for this kind of in-editor interaction.

Development Experience Insights

  • Reflecting on past experiences, the speaker shares challenges faced during prototype development for GDC talks, emphasizing rapid technological changes in tools.
  • Despite initial setbacks with tool functionality, the speaker highlights the importance of treating new technologies as experimental playgrounds.

Surprising Features in Game Development

  • The ability to create advanced shaders without external help was particularly impressive; this capability allowed for significant visual improvements in games without needing additional resources.

Use Cases for MCP Integration

  • The discussion shifts to connecting Unity MCP with other servers; Miro flowcharts are mentioned but not explored deeply due to unfamiliarity.

Exploring Open Source Framework Recommendations

Local vs. Cloud-Based Models

  • A technical inquiry about open-source frameworks suitable for quest design leads to a discussion on local versus cloud-based models; self-hosted models require substantial hardware resources which can be costly.

Personal Experiences with Local Models

  • The speaker mentions a friend who runs local models on high-end hardware but acknowledges limitations in familiarity with specific workflows related to local model usage.

What Innovations Can We Expect in the Coming Year?

Anticipated Developments in AI Tools

  • The speaker discusses the rapid advancements being made at Meta, particularly regarding developers adding new skills and features to existing tools.
  • A recommended resource is mentioned: a repository containing various "code agents" that showcase different skills, which was highlighted in earlier slides.
  • Upcoming tools include more command-line interface (CLI) options and enhancements like an immersive debugger with AI capabilities. Additionally, improvements are expected for the runtime optimizer, which will support agentic workflows.
  • The speaker notes ongoing efforts to optimize tools specifically for VR developers, indicating that significant behind-the-scenes work is currently underway based on industry feedback.

Conclusion of Discussion

  • The session wraps up with gratitude expressed towards Germán for his insights. The speaker expresses eagerness to explore the discussed tools further and share findings with others.

Video description

Ready to take your game to the next level? Here we'll cover advanced tooling secrets to accelerate your VR development cycle, squeeze maximum performance out of Quest, and utilize VR-native features to their fullest potential. Includes a deep dive into AI-assisted development environment workflows that are changing how we build and iterate.

  • Meta Agentic Tools: https://github.com/meta-quest/agentic-tools
  • Unity AI Assistant (Beta): https://docs.unity3d.com/Packages/com.unity.ai.assistant@2.3/manual/index.html
  • Meta MCP Extensions: https://github.com/meta-quest/Unity-MCP-Extensions
  • Meta Quest hzdb CLI: https://www.npmjs.com/package/@meta-quest/hzdb

🔔 Subscribe for more XR Tutorials: https://www.youtube.com/@dilmerv
🔗 X: https://x.com/dilmerv
👥 Discord: https://discord.gg/dNMHBc8KdP
📸 Instagram: https://tiktok.com/@dilmerval
🥽 Learn & Get my XR Courses from: https://www.learnxr.io
👉 My Blog / 🔥 Newsletter (subscribe for up-to-date XR news): https://blog.learnxr.io

#xr #metaverse #unity