gninatorch.gnina module

gninatorch.gnina.load_gnina_model(gnina_model: str, dimension: float = 23.5, resolution: float = 0.5)[source]

Load a pre-trained GNINA model.

Parameters
  • gnina_model (str) – GNINA model name

  • dimension (float) – Grid dimension (in Angstrom)

  • resolution (float) – Grid resolution (in Angstrom)
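
Example (a minimal sketch; the model name used here is an assumption based on the GNINA CLI model names and is not guaranteed by this page):

    from gninatorch.gnina import load_gnina_model

    # Load a single pre-trained model on the default 23.5 Angstrom grid with
    # 0.5 Angstrom resolution; "crossdock_default2018" is an assumed name.
    model = load_gnina_model("crossdock_default2018")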

gninatorch.gnina.load_gnina_models(model_names: Iterable[str], dimension: float = 23.5, resolution: float = 0.5)[source]

Load multiple pre-trained GNINA models.

Parameters
  • model_names (Iterable[str]) – List of GNINA model names

  • dimension (float) – Grid dimension (in Angstrom)

  • resolution (float) – Grid resolution (in Angstrom)
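
Example (a minimal sketch; the member names listed here are assumptions based on the GNINA CLI model names):

    from gninatorch.gnina import load_gnina_models

    # Load several pre-trained models that share the same grid settings
    models = load_gnina_models(
        ["crossdock_default2018", "general_default2018"],
        dimension=23.5,
        resolution=0.5,
    )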

gninatorch.gnina.main(args)[source]

Run inference with GNINA pre-trained models.

Parameters

args (Namespace) – Parsed command line arguments

Notes

Models are used in evaluation mode, which is essential for the dense models since they use batch normalisation.

gninatorch.gnina.options(args: Optional[List[str]] = None)[source]

Define options and parse arguments.

Parameters

args (Optional[List[str]]) – List of command line arguments
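
Example (a hedged sketch combining options and main; "input.types" is a hypothetical placeholder, since the actual arguments accepted are defined by options):

    from gninatorch import gnina

    # Build a Namespace from an explicit argument list instead of sys.argv,
    # then run inference with it; the single positional argument shown here
    # is a placeholder, not a documented interface.
    args = gnina.options(["input.types"])
    gnina.main(args)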

gninatorch.gnina.setup_gnina_model(cnn: str = 'default', dimension: float = 23.5, resolution: float = 0.5) → Union[Module, bool][source]

Load a model or an ensemble of models.

Parameters
  • cnn (str) – CNN model name

  • dimension (float) – Grid dimension (in Angstrom)

  • resolution (float) – Grid resolution (in Angstrom)

Returns

Model or ensemble of models

Return type

nn.Module

Notes

Mimics the GNINA CLI. The model is returned in evaluation mode, which is essential for the dense models to behave correctly (they contain nn.BatchNorm layers).
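
Example (a minimal sketch; per the Return type above, the result is treated as an nn.Module already in evaluation mode):

    import torch

    from gninatorch.gnina import setup_gnina_model

    # "default" mirrors the GNINA CLI default CNN; the returned module
    # (a single model or an ensemble wrapper) should already be in eval mode
    model = setup_gnina_model(cnn="default", dimension=23.5, resolution=0.5)
    print(isinstance(model, torch.nn.Module), model.training)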