Autonomous systems are only as powerful as the scenarios they can model. Yet creating realistic, controllable driving scenes remains a major bottleneck, especially when flexibility across vehicle types and trajectories is limited.

Research from NEC Laboratories America introduces a new approach that significantly expands the possibilities in controllable scene generation.

Driving the Future of Scene Editing with HorizonForge

The research paper, HorizonForge: Driving Scene Editing with Any Trajectories and Any Vehicles, introduces a flexible framework for editing driving scenes with precise control over vehicle behavior and identity while preserving realism. NEC Laboratories America intern Yifan Wang (Stony Brook University) from our Summer 2025 class leads this work in collaboration with Francesco Pittaluga, Zaid Tasneem, and Ziyu Jiang (NEC Laboratories America, Inc.), Chenyu You (Stony Brook University), and Manmohan Chandraker (NEC Laboratories America, Inc.; University of California, San Diego).

A New Perspective: From Fixed Scenarios to Fully Controllable Worlds

Traditional scene generation methods often lock users into predefined vehicle types or limited motion patterns, restricting their usefulness in real-world testing and development. HorizonForge challenges this constraint by enabling simultaneous control over what appears in a scene and how it moves.

“Current scene generation approaches often force a tradeoff between realism and controllability. With HorizonForge, we show that it is possible to achieve both, enabling users to define arbitrary trajectories and vehicle types within a unified framework. This opens the door to more flexible and scalable simulation environments that better reflect real-world complexity.” – Yifan Wang, Summer Intern, Stony Brook University

This shift reframes scene generation from a static or semi-controlled process into a fully programmable one, where developers can design edge cases, rare events, and targeted scenarios with precision. Unlike prior approaches that constrain either motion or appearance, HorizonForge controls both within a single framework, without retraining and without sacrificing scene coherence.

What Is HorizonForge?

HorizonForge is a generative framework designed to edit and synthesize driving scenes with fine-grained control over vehicles and motion. It builds on advances in generative AI and scene understanding to produce realistic outputs while honoring user-defined constraints.

At a high level, HorizonForge enables:

  • Trajectory control: Define arbitrary paths for vehicles within a scene
  • Vehicle flexibility: Insert and modify different vehicle types without retraining
  • Scene consistency: Maintain realistic interactions between vehicles and environments

By decoupling motion and appearance while preserving coherence, the system allows users to construct complex scenarios that were previously difficult or impossible to generate. This decoupling also lets HorizonForge generalize across diverse driving scenarios, supporting rapid iteration while keeping complex, multi-vehicle interactions consistent.
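The decoupling of "what appears" from "how it moves" can be pictured with a small scenario-specification sketch. This is purely illustrative: the class names, fields, and pairing logic below are assumptions for exposition, not the paper's actual API.

```python
from dataclasses import dataclass

@dataclass
class VehicleSpec:
    """What appears in the scene: vehicle identity, independent of motion."""
    model: str            # e.g. "sedan", "box_truck" (hypothetical labels)
    color: str = "white"

@dataclass
class TrajectorySpec:
    """How it moves: an arbitrary user-defined path of (x, y, heading_deg) waypoints."""
    waypoints: list

@dataclass
class SceneEdit:
    """One edit request: any vehicle paired with any trajectory."""
    vehicle: VehicleSpec
    trajectory: TrajectorySpec

def build_edits(vehicles, trajectories):
    """Cross every vehicle with every trajectory. Because identity and motion
    are specified independently, every pairing is a valid edit request."""
    return [SceneEdit(v, t) for v in vehicles for t in trajectories]

# Two vehicle types x two paths -> four distinct scenario edits,
# with no retraining implied between them.
vehicles = [VehicleSpec("sedan"), VehicleSpec("box_truck", color="yellow")]
paths = [
    TrajectorySpec([(0, 0, 0), (10, 0, 0), (20, 2, 15)]),    # gradual lane drift
    TrajectorySpec([(0, 0, 0), (5, -3, -40), (8, -8, -80)]), # hard cut-out
]
edits = build_edits(vehicles, paths)
print(len(edits))  # 4
```

The point of the sketch is the cross product: once motion and appearance are independent inputs, the space of scenarios grows multiplicatively, which is what makes targeted edge-case generation cheap.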

Real-World Implications

HorizonForge has immediate relevance across industries that depend on high-quality simulation and visual realism:

  • Autonomous driving development: Generate rare or dangerous edge cases safely, improving model robustness
  • Simulation at scale: Rapidly create diverse datasets without manual annotation or expensive data collection
  • Digital twins and smart cities: Model traffic patterns with greater flexibility and realism
  • Content creation and visualization: Enable dynamic editing of driving scenes for media, gaming, and training

By lowering the barrier to creating controlled yet realistic environments, HorizonForge accelerates both experimentation and deployment: teams can generate targeted edge cases and scenario variations on demand rather than relying on costly real-world data collection.

Why This Matters Now

As AI systems move from research to real-world deployment, the demand for high-quality, diverse, and controllable data is growing rapidly. Autonomous vehicles, robotics, and urban planning systems all require exposure to complex scenarios that are difficult to capture in the real world. HorizonForge addresses key questions facing the industry:

  • How can we simulate rare but critical events without waiting for them to occur?
  • How do we scale scenario generation without sacrificing realism?
  • Can we give developers precise control over both behavior and appearance in generated environments?

By answering these questions, HorizonForge aligns with broader trends in generative AI, simulation-driven development, and data-centric system design.

Final Thoughts

HorizonForge represents a meaningful step toward fully controllable, high-fidelity scene generation. By combining flexibility with realism, it enables a new class of tools for simulation, testing, and visualization. This work underscores NEC Laboratories America’s leadership in advancing applied AI research that bridges theory and real-world impact. As simulation becomes central to the development of intelligent systems, approaches like HorizonForge will play a critical role in shaping how those systems are built, tested, and trusted.

About the Authors