Simulation Integration

Integrate Open-PAV with various simulation platforms to analyze automated vehicle behaviors.

Supported Platforms

  • Python-Based Basic Model: Simulate simple car-following behavior using a basic linear model.
  • SUMO: Implement Intelligent Driver Model (IDM) for SUMO-based simulations.
  • VISSIM: Implement Wiedemann-99 models with external driver models.
  • TorchScript (AI-Based Models): Deploy deep learning-based car-following models using PyTorch.

Python-Based Basic Model

A Basic Linear Model can be implemented using the following car-following equation:

$$
a = k_v (v_l - v_f) + k_g (p_l - p_f - v_f \cdot T_d) + z
$$

where:

  • $v_l, v_f$ = velocities of the lead and following vehicles
  • $p_l, p_f$ = positions of the lead and following vehicles
  • $T_d$ = time delay
  • $k_v, k_g, z$ = model parameters
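
As a quick illustration, the update rule above can be coded directly in Python. The sketch below is a minimal, self-contained example rather than the packaged Open-PAV implementation; the parameter values, the 0.1 s Euler time step, and the initial states are placeholders chosen only for demonstration.

```python
def linear_cf_acceleration(v_l, v_f, p_l, p_f, k_v=0.3, k_g=0.2, T_d=1.5, z=0.0):
    """Basic linear car-following model: a = k_v*(v_l - v_f) + k_g*(p_l - p_f - v_f*T_d) + z."""
    return k_v * (v_l - v_f) + k_g * (p_l - p_f - v_f * T_d) + z

# Toy scenario: follower starts 50 m behind a lead vehicle cruising at 30 m/s.
dt, steps = 0.1, 600
p_l, v_l = 50.0, 30.0   # lead position [m], speed [m/s]
p_f, v_f = 0.0, 25.0    # follower position [m], speed [m/s]
for _ in range(steps):
    a = linear_cf_acceleration(v_l, v_f, p_l, p_f)
    v_f = max(0.0, v_f + a * dt)   # forward Euler update; no reversing
    p_f += v_f * dt
    p_l += v_l * dt

print(f"final gap: {p_l - p_f:.1f} m, final follower speed: {v_f:.1f} m/s")
```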

Example Simulation

Below is a plot of a simple highway simulation where a Tesla follows a lead vehicle.

![Basic Model Simulation]

To run the simulation, refer to the Quick Start guide.


SUMO Integration

The Intelligent Driver Model (IDM) is supported in SUMO for car-following simulations.
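
For reference, the Intelligent Driver Model computes the follower's acceleration from its speed $v$, the gap $s$ to the leader, and the approach rate $\Delta v = v_f - v_l$. In its standard form,

$$
a = a_{\max}\left[1 - \left(\frac{v}{v_0}\right)^{\delta} - \left(\frac{s^*(v, \Delta v)}{s}\right)^{2}\right],
\qquad
s^*(v, \Delta v) = s_0 + v\,T + \frac{v\,\Delta v}{2\sqrt{a_{\max}\, b}}
$$

where $v_0$ is the desired speed, $T$ the desired time headway, $s_0$ the minimum gap, $a_{\max}$ the maximum acceleration, $b$ the comfortable deceleration, and $\delta$ the acceleration exponent (commonly 4). In the SUMO `vType` shown below, `accel`, `decel`, and `minGap` correspond approximately to $a_{\max}$, $b$, and $s_0$, and `tau` (if set) provides the time headway $T$.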

Steps to Integrate IDM in SUMO

  1. Install SUMO if it is not already installed:
     ```bash
     sudo apt-get install sumo sumo-tools sumo-doc  # Ubuntu
     ```
  2. Prepare the SUMO configuration: open the route file referenced by your configuration and add a vehicle type that selects the IDM car-following model (`carFollowModel="IDM"`):
     ```xml
     <vType id="IDM" carFollowModel="IDM" accel="XX" decel="XX" sigma="XX" length="5" minGap="2.5" maxSpeed="33.3" guiShape="passenger"/>
     ```
  3. Assign the IDM vehicle type to vehicles:
     ```xml
     <vehicle id="veh0" type="IDM" route="route0" depart="0"/>
     ```
  4. Run the SUMO simulation:
     ```bash
     sumo -c simulation.sumocfg
     ```

🔗 More details on IDM in SUMO
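
If the scenario should be driven from Python rather than the `sumo` command line, SUMO's TraCI interface can step the simulation and record the IDM vehicle's trajectory for analysis. The sketch below reuses `simulation.sumocfg` and `veh0` from the steps above; the logging strategy is just one possible setup, not a prescribed Open-PAV workflow.

```python
import traci

# Launch SUMO as a TraCI-controlled subprocess (use "sumo-gui" for a visual run).
traci.start(["sumo", "-c", "simulation.sumocfg"])

trajectory = []  # (time [s], speed [m/s], position along lane [m]) samples for veh0
while traci.simulation.getMinExpectedNumber() > 0:
    traci.simulationStep()
    if "veh0" in traci.vehicle.getIDList():
        trajectory.append((
            traci.simulation.getTime(),
            traci.vehicle.getSpeed("veh0"),
            traci.vehicle.getLanePosition("veh0"),
        ))

traci.close()
print(f"collected {len(trajectory)} samples for veh0")
```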


VISSIM Integration (Wiedemann-99 Model)

VISSIM provides the Wiedemann-99 car-following model and allows personalized driver behavior to be supplied through its External Driver Model (DLL) interface.

Steps to Integrate Wiedemann-99 in VISSIM

  1. Open PTV VISSIM.
  2. Build the road network and configure the simulation.
  3. Compile the personalized driving model:
     • Open car_follow_model.vcxproj in Visual Studio and build it.
     • The build generates a DriverModel.dll file.
  4. Load the driver model in VISSIM:
     • Open the Vehicle Types interface.
     • Add four vehicle types and link the DLL file.
     • Set External Driver Model and browse to the DriverModel.dll file.
  5. Run the simulation:
     • Configure evaluation settings under Evaluation > Configuration.
     • Start the VISSIM simulation.

🔗 Official Wiedemann-99 Documentation
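
Once the external driver model has been linked in the GUI, the same scenario can also be launched from Python through VISSIM's COM interface (via `pywin32` on Windows). Treat the snippet below as a rough sketch: the network path is hypothetical, and attribute names such as `"SimPeriod"` follow the usual VISSIM COM convention but may differ between versions.

```python
import win32com.client as com

# Attach to (or launch) a local VISSIM installation through COM.
vissim = com.Dispatch("Vissim.Vissim")
vissim.LoadNet(r"C:\models\highway.inpx")        # hypothetical network file
vissim.Simulation.SetAttValue("SimPeriod", 600)  # simulate 600 s
vissim.Simulation.RunContinuous()                # runs with the linked DriverModel.dll
```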


AI-Based Model Integration (TorchScript)

For deep learning-based car-following models, Open-PAV supports TorchScript inference.

Steps to Deploy AI-Based Car-Following Models

  1. **Train a Neural Network-Based Model** using PyTorch:
     ```python
     import torch

     class CarFollowingModel(torch.nn.Module):
         def __init__(self):
             super().__init__()
             self.linear = torch.nn.Linear(4, 1)

         def forward(self, x):
             return self.linear(x)

     model = CarFollowingModel()
     torch.save(model, "car_following_model.pt")
     ```
  2. **Convert to TorchScript for Deployment**:
     ```python
     model = torch.jit.script(model)
     model.save("car_following_model_scripted.pt")
     ```
  3. **Run AI-Based Simulation**:
     ```python
     model = torch.jit.load("car_following_model_scripted.pt")
     input_data = torch.tensor([[v_l, v_f, p_l, p_f]])  # example inputs: speeds and positions
     predicted_acceleration = model(input_data)
     ```

This approach allows AI-enhanced car-following models to be deployed efficiently.
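
To close the loop, the exported TorchScript model can replace the linear formula in the Python-based basic model above. The sketch below assumes the model consumes the four features `[v_l, v_f, p_l, p_f]` in that order, matching the training example; the time step and initial states are illustrative.

```python
import torch

model = torch.jit.load("car_following_model_scripted.pt")
model.eval()

dt = 0.1
p_l, v_l = 50.0, 30.0   # lead vehicle position [m] and speed [m/s]
p_f, v_f = 0.0, 25.0    # following vehicle position [m] and speed [m/s]

with torch.no_grad():
    for _ in range(600):
        features = torch.tensor([[v_l, v_f, p_l, p_f]], dtype=torch.float32)
        a = model(features).item()     # predicted follower acceleration [m/s^2]
        v_f = max(0.0, v_f + a * dt)   # same Euler update as the basic linear model
        p_f += v_f * dt
        p_l += v_l * dt

print(f"final gap: {p_l - p_f:.1f} m")
```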