CoSimo is a Python library for building and managing extensible multi-simulation and co-simulation pipelines. It supports both local and remote simulators and facilitates the development of complex simulations by combining multiple components into a structured pipeline.
- Dynamic pipelines: Combine multiple simulators to form multi-step processes.
- Extensible architecture: Implement custom simulators by extending the `ISimulator` interface.
- Remote simulation support: Use Pyro4 to run simulations on remote servers and connect them to your local pipeline.
- Data flow control: Seamlessly pass data between simulators in the pipeline.
- Clone the repository:

  ```bash
  git clone https://github.com/lgiannantoni/CoSimo.git
  cd CoSimo
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Install the package:

  ```bash
  python setup.py install
  ```
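To verify the installation, check that the simulator interface used throughout this tutorial is importable:

```python
# Quick installation check: the simulator interface should be importable.
from cosimo.simulator import ISimulator

print(ISimulator)
```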
This tutorial will guide you through the process of building a complete simulation pipeline.
First, we will implement a simulator that generates random data at each step.
File: `data_sim.py`

```python
from cosimo.simulator import ISimulator
import random


class DataGenerator(ISimulator):
    def step(self, *args, **kwargs):
        data = {"value": random.randint(0, 100)}
        print(f"DataGenerator output: {data}")
        return (data,), kwargs

    def reset(self):
        print("DataGenerator reset.")

    def add_model(self, *args, **kwargs):
        pass
```
Now, let's create a second simulator that processes the data generated by `DataGenerator`.
File: `processor_sim.py`

```python
from cosimo.simulator import ISimulator


class DataProcessor(ISimulator):
    def step(self, data, *args, **kwargs):
        processed_data = {"value": data["value"] * 2}
        print(f"DataProcessor output: {processed_data}")
        return (processed_data,), kwargs

    def reset(self):
        print("DataProcessor reset.")

    def add_model(self, *args, **kwargs):
        pass
```
Finally, create a simulator that visualizes the processed data.
File: `visualizer_sim.py`

```python
from cosimo.simulator import ISimulator


class DataVisualizer(ISimulator):
    def step(self, data, *args, **kwargs):
        print(f"DataVisualizer output (final result): {data}")
        return (data,), kwargs

    def reset(self):
        print("DataVisualizer reset.")

    def add_model(self, *args, **kwargs):
        pass
```
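Notice the return convention shared by all three simulators: `step()` returns a tuple of outputs plus the keyword arguments, and the outputs become the positional arguments of the next simulator's `step()`. The sketch below chains the three classes by hand to illustrate this data flow; it assumes the `ISimulator` subclasses can be instantiated without constructor arguments, and it is not needed in practice, since the pipeline performs this chaining for you.

```python
# Illustrative only: the CoSimo pipeline normally performs this chaining.
# Assumption: these ISimulator subclasses take no required constructor arguments.
from data_sim import DataGenerator
from processor_sim import DataProcessor
from visualizer_sim import DataVisualizer

simulators = [DataGenerator(), DataProcessor(), DataVisualizer()]

args, kwargs = (), {}
for sim in simulators:
    # The outputs of one step become the positional arguments of the next step.
    args, kwargs = sim.step(*args, **kwargs)
```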
Set up a configuration file that specifies the simulators in the pipeline.
File: `simulation_conf.py`

```python
SIM_CONFIG = {
    "data_generator": {"python": "data_sim:DataGenerator"},
    "data_processor": {"python": "processor_sim:DataProcessor"},
    "data_visualizer": {"python": "visualizer_sim:DataVisualizer"}
}
```
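Each `"python"` entry follows the `module:ClassName` convention, so the listed modules must be importable from the directory where the simulation is launched. The snippet below shows roughly how such a string can be resolved to a class; it is only an illustration of the convention, not CoSimo's actual loader.

```python
import importlib


def resolve(spec: str):
    """Resolve a "module:ClassName" string to the class it names (illustration only)."""
    module_name, class_name = spec.split(":")
    return getattr(importlib.import_module(module_name), class_name)


# "data_sim:DataGenerator" -> the DataGenerator class defined in data_sim.py
DataGenerator = resolve("data_sim:DataGenerator")
```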
Run the simulation:
File: `run_simulation.py`

```python
from cosimo.simulation import Simulation
from simulation_conf import SIM_CONFIG

# Set up the simulation with the specified configuration
Simulation.setup(**SIM_CONFIG)

# Run the simulation for 5 steps
Simulation.start(sim_steps=5)
```
Example output (the values differ between runs, since the data is random):

```
DataGenerator output: {'value': 42}
DataProcessor output: {'value': 84}
DataVisualizer output (final result): {'value': 84}
DataGenerator output: {'value': 7}
DataProcessor output: {'value': 14}
DataVisualizer output (final result): {'value': 14}
```
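When the run is finished, the pipeline can be torn down with `Simulation.shutdown()`, documented in the API reference below. A minimal sketch, using the same class-level call style as `setup` and `start`:

```python
from cosimo.simulation import Simulation

# Shut down the pipeline and release its resources once the run is complete.
Simulation.shutdown()
```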
We will now replace the `DataProcessor` simulator with a remote version.

File: `remote_processor_sim.py`
```python
from cosimo.simulator import ISimulator


class RemoteDataProcessor(ISimulator):
    def step(self, data, *args, **kwargs):
        processed_data = {"value": data["value"] * 3}
        print(f"RemoteDataProcessor output: {processed_data}")
        return (processed_data,), kwargs

    def reset(self):
        print("RemoteDataProcessor reset.")

    def add_model(self, *args, **kwargs):
        pass


if __name__ == "__main__":
    RemoteDataProcessor.serve(host="0.0.0.0", port=9090)
```
Update `simulation_conf.py` so that the processing stage points at the remote simulator:

```python
SIM_CONFIG = {
    "data_generator": {"python": "data_sim:DataGenerator"},
    "remote_processor": {"remote": "user@remote-host:9090"},
    "data_visualizer": {"python": "visualizer_sim:DataVisualizer"}
}
```
- Copy `remote_processor_sim.py` to the remote server.
- Run it on the remote server:

  ```bash
  python remote_processor_sim.py
  ```

- Execute the local pipeline as before; it will automatically connect to the remote simulator.
This package contains the essential components of CoSimo: simulators, pipelines, utility functions, and the main simulation lifecycle manager.
- Class `Simulation`

  Manages the entire simulation process, including initialization, execution, and termination.

  Key methods and parameters:
  - `setup(**kwargs)`
    - Description: Initializes the simulation pipeline based on the given configuration.
    - Parameters: `kwargs` (`dict`): key-value pairs defining the configuration for each simulator in the pipeline, e.g. `{"simulator_name": {"python": "module_name:ClassName"}, "remote_simulator": {"remote": "user@host:port"}}`.
  - `start(sim_steps)`
    - Description: Executes the simulation for a specified number of steps.
    - Parameters: `sim_steps` (`int`): number of steps to run the simulation.
  - `shutdown()`
    - Description: Shuts down the pipeline and releases resources.
- Class `ISimulator` (abstract base class)

  Base class for all simulators. Defines the required interface for implementing custom simulators.

  Abstract methods and parameters:
  - `step(*args, **kwargs)`
    - Description: Defines the behavior of the simulator for one step of the simulation.
    - Parameters: `args` (`tuple`): positional arguments passed to the simulator; `kwargs` (`dict`): keyword arguments containing configuration or state information.
  - `reset()`
    - Description: Resets the simulator's internal state.
  - `add_model(*args, **kwargs)`
    - Description: Adds a model to the simulator for internal use.
    - Parameters: `args`, `kwargs`: parameters for model configuration.
- Class `Proxy`

  Connects to and manages remote simulators using Pyro4.

  Key methods and parameters:
  - `__init__(user, host, port, sim_class, sim_module, sim_name=None)`
    - Description: Initializes a connection to a remote simulator.
    - Parameters: `user` (`str`): username for the remote machine; `host` (`str`): remote server address; `port` (`int`): port number for the simulator; `sim_class` (`str`): name of the simulator class; `sim_module` (`str`): module where the simulator is defined; `sim_name` (`str`, optional): optional name for the simulator instance.
- Class `Pipeline`

  Manages the simulation pipeline, chaining simulators and handling data flow between them (see the usage sketch after this reference list).

  Key methods and parameters:
  - `__add__(other)`
    - Description: Adds a new simulator or merges another pipeline.
    - Parameters: `other` (`Pipeline`, `ISimulator`, or `Proxy`): the object to add to the pipeline.
  - `do(*args, **kwargs)`
    - Description: Executes the pipeline.
    - Parameters: `args`, `kwargs`: input data for the first simulator in the pipeline.
  - `set_level(level)`
    - Description: Sets the logging level for the pipeline.
    - Parameters: `level` (`Level`): logging level (e.g., `Level.DEBUG`).
- Class `AdvEnum`

  Extended enumeration class with additional helper methods and serialization support.

  Key methods:
  - `list()`: returns all enum values as a list.
  - `random()`: returns a random enum value.
- Class `UniqueIdMap`

  Generates unique IDs for objects using `UUID`.
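To give a feel for how these pieces could fit together, here is a hedged sketch that builds a pipeline from the tutorial's simulators using the `__add__` and `do()` methods documented above. The `cosimo.pipeline` import path and the zero-argument constructors are assumptions made for illustration; in practice, `Simulation.setup()` builds the pipeline for you from `SIM_CONFIG`.

```python
# Sketch only: the module path and constructor signatures below are assumptions.
from cosimo.pipeline import Pipeline  # assumed import path

from data_sim import DataGenerator
from processor_sim import DataProcessor
from visualizer_sim import DataVisualizer

# Chain the simulators with the pipeline's __add__ operator.
pipeline = Pipeline() + DataGenerator() + DataProcessor() + DataVisualizer()

# Execute one pass through the pipeline; the outputs of each simulator
# become the inputs of the next one.
pipeline.do()
```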
Contributions are welcome! Fork the repository, create a feature branch, and submit a pull request.
Leonardo Giannantoni
Politecnico di Torino - Control and Computer Engineering Department
PhD Candidate
E: leonardo.giannantoni@polito.it
P: +39 011 090 7191
M: +39 377 283 4499
A: Corso Duca degli Abruzzi 24, 10129 Torino, Italy
- Leonardo Giannantoni, et al. "A methodology for co-simulation-based optimization of biofabrication protocols". IWBBIO 2022.
The co-simulation methodology presented in this paper is built on top of CoSimo.
APA
Giannantoni, L. (2021). CoSimo (Version 1.0.0) [Computer software]. https://github.com/lgiannantoni/CoSimo
BibTeX

```bibtex
@software{Giannantoni_CoSimo_2021,
  author = {Giannantoni, Leonardo},
  month = {1},
  title = {{cosimo}},
  url = {https://github.com/lgiannantoni/CoSimo},
  version = {1.0.0},
  year = {2021}
}
```
CoSimo is licensed under the MIT license. A copy of this license is included in the file LICENSE.