A.19 Python Libraries for Predefined Conda Environments

Compliance Studio comes with the following predefined Conda environments; a sketch for listing the packages installed in each of them appears after the list:

  • default_<CS version>
  • ml4aml_<CS version>
  • sane_<CS version>
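
The package lists in Tables A-9 through A-11 can be cross-checked against a deployed environment. The following is a minimal sketch, not part of the product tooling; it assumes each environment's interpreter is located under <COMPLIANCE_STUDIO_INSTALLED_PATH>/deployed/python_packages/<environment name>/bin (generalized from the ml4aml path used in the PySpark procedure later in this section) and should be run with that interpreter:

    # list_env_packages.py -- print every installed package and its version,
    # for comparison against the tables in this section.
    # Example invocation (the path is an assumption, see the note above):
    #   ./python3 list_env_packages.py
    from importlib.metadata import distributions

    # Collect (name, version) pairs for every distribution visible to this
    # interpreter, then print them in alphabetical order.
    packages = sorted((dist.metadata["Name"], dist.version) for dist in distributions())
    for name, version in packages:
        print(f"{name} {version}")

Running ./python3 -m pip list from the same bin directory produces an equivalent listing.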

Table A-9 Default Conda Python Environment

Package Version
pip 24.3.1
setuptools 78.1.1
attrs 24.3.0
certifi 2024.12.14
charset-normalizer 3.4.0
cloudpickle 3.1.0
cycler 0.12.1
fonttools 4.55.3
graphviz 0.20.3
greenlet 3.1.1
idna 3.1
Jinja2 3.1.6
joblib 1.4.2
kiwisolver 1.4.7
llvmlite 0.43.0
MarkupSafe 3.0.2
numba 0.60.0
packaging 24.2
patsy 1.0.1
Pillow 11.0.0
psutil 6.1.1
pyaml 24.12.1
pyparsing 3.2.0
python-dateutil 2.9.0.post0
pytz 2024.2
PyYAML 6.0.2
six 1.17.0
sklearn 0
slicer 0.0.8
tabulate 0.9.0
threadpoolctl 3.5.0
tqdm 4.67.1
urllib3 2.2.3
wheel 0.45.1
Babel 2.16.0
docutils 0.21.2
imagesize 1.4.1
importlib-metadata 8.5.0
Pygments 2.18.0
snowballstemmer 2.2.0
alabaster 0.7.16
zipp 3.21.0
sphinxcontrib-applehelp 2.0.0
sphinxcontrib-devhelp 2.0.0
sphinxcontrib-htmlhelp 2.1.0
sphinxcontrib-jquery 4.1
sphinxcontrib-jsmath 1.0.1
sphinxcontrib-qthelp 2.0.0
sphinxcontrib-serializinghtml 2.0.0
sphinx_rtd_theme 3.0.2
eli5 0.13.0
xgboost 2.1.4
scikit-learn 1.6.0
seaborn 0.13.2
imbalanced-learn 0.12.4
py4j 0.10.9.8
scikit-optimize 0.10.2
statsmodels 0.14.4
pyod 2.0.2
requests 2.32.3
minisom 2.3.3
scipy 1.13.1
sqlalchemy 2.0.36
oracledb 2.5.1
matplotlib 3.9.4
pandas 2.2.3
numpy 2.0.2
editdistance 0.8.1
pyjnius 1.6.1
cython 3.0.11
matplotlib-venn 1.1.1
cx-oracle 8.3.0
sphinx 7.4.7
shap 0.46.0
PDPbox 0.3.0

Table A-10 ml4aml Conda Environment

Package Version
pip 24.3.1
setuptools 78.1.1
attrs 24.3.0
certifi 2024.12.14
charset-normalizer 3.4.0
cloudpickle 3.1.0
cycler 0.12.1
fonttools 4.55.3
graphviz 0.20.3
greenlet 3.1.1
idna 3.1
Jinja2 3.1.6
joblib 1.4.2
kiwisolver 1.4.7
llvmlite 0.43.0
MarkupSafe 3.0.2
numba 0.60.0
packaging 24.2
patsy 1.0.1
Pillow 11.0.0
psutil 6.1.1
pyaml 24.12.1
pyparsing 3.2.0
python-dateutil 2.9.0.post0
pytz 2024.2
PyYAML 6.0.2
six 1.17.0
sklearn 0
slicer 0.0.8
tabulate 0.9.0
threadpoolctl 3.5.0
tqdm 4.67.1
urllib3 2.2.3
wheel 0.45.1
Babel 2.16.0
docutils 0.21.2
imagesize 1.4.1
importlib-metadata 8.5.0
Pygments 2.18.0
snowballstemmer 2.2.0
alabaster 0.7.16
zipp 3.21.0
setuptools_scm 8.2.0
sphinxcontrib-applehelp 2.0.0
sphinxcontrib-devhelp 2.0.0
sphinxcontrib-htmlhelp 2.1.0
sphinxcontrib-jquery 4.1
sphinxcontrib-jsmath 1.0.1
sphinxcontrib-qthelp 2.0.0
sphinxcontrib-serializinghtml 2.0.0
sphinx_rtd_theme 3.0.2
eli5 0.13.0
xgboost 2.1.4
scikit-learn 1.6.0
seaborn 0.13.2
imbalanced-learn 0.12.4
py4j 0.10.9.8
scikit-optimize 0.10.2
statsmodels 0.14.4
pyod 2.0.2
requests 2.32.3
minisom 2.3.3
scipy 1.13.1
sqlalchemy 2.0.36
oracledb 2.5.1
matplotlib 3.9.4
pandas 2.2.3
numpy 2.0.2
editdistance 0.8.1
pyjnius 1.6.1
cython 3.0.11
matplotlib-venn 1.1.1
cx-oracle 8.3.0
sphinx 7.4.7
shap 0.46.0
PDPbox 0.3.0
pyarrow 16.1.0
pydantic 2.7.2
annotated-types 0.7.0
pydantic_core 2.18.3
typing_extensions 4.12.2
modin 0.30.0
fsspec 2025.2.0
evidently 0.4.25
dynaconf 3.2.10
iterative-telemetry 0.0.10
litestar 2.16.0
nltk 3.9.1
plotly 6.0.0
rich 13.9.4
typer 0.15.2
typing-inspect 0.9.0
ujson 5.10.0
uvicorn 0.34.0
watchdog 6.0.0
whylogs 1.3.32
platformdirs 3.11.0
protobuf 6.30.0
types-requests 2.32.0.20250306
whylabs-client 0.6.16
whylogs-sketching 3.4.1.dev3
pybars3 0.9.7
PyMeta3 0.5.1
onnx 1.16.0
IPython 8.14.0
backcall 0.2.0
decorator 5.2.1
jedi 0.19.2
matplotlib-inline 0.1.7
pexpect 4.9.0
pickleshare 0.7.5
prompt_toolkit 3.0.50
stack-data 0.6.3
traitlets 5.14.3
aif360 0.6.1
optuna 3.2.0
alembic 1.15.1
cmaes 0.11.1
colorlog 6.9.0
oracle-guardian-ai 1.0.1

Note:

The PySpark Python package is not part of the default environment.

Install PySpark for the ml4aml Conda Python Environment

To use PySpark, download the PySpark Python package from the deployed Spark distribution and install it in the Conda Python environment of Compliance Studio.

To install the PySpark Python package, follow these steps:
  1. Log in to the UNIX machine where Compliance Studio is installed.
  2. Navigate to the <COMPLIANCE_STUDIO_INSTALLED_PATH>/deployed/python_packages/ml4aml/bin directory.
  3. If the machine is connected to the internet, install PySpark by executing the following command:

    ./python3 -m pip install pyspark

  4. If the machine is not connected to the internet, download the available PySpark package from the deployed Spark distribution.
  5. Copy the package to any location on the UNIX machine and install it by executing the following command (a verification sketch follows these steps):

    ./python3 -m pip install pyspark --no-index --find-links $FULL_PATH_INCLUDING_PYSPARK_PACKAGE_NAME
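
After either installation path, the following smoke test can confirm that PySpark is usable from the ml4aml environment. This is a sketch only; it assumes a compatible Java runtime and the deployed Spark distribution are available on the machine, and it should be run with the same ./python3 interpreter used in the steps above:

    # verify_pyspark.py -- confirm that the pyspark package imports and that a
    # trivial local job runs end to end.
    import pyspark
    from pyspark.sql import SparkSession

    print("PySpark version:", pyspark.__version__)

    # local[1] runs Spark inside this process, so no cluster is required.
    spark = (
        SparkSession.builder
        .master("local[1]")
        .appName("pyspark-install-check")
        .getOrCreate()
    )
    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "label"])
    print("Row count:", df.count())
    spark.stop()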

Table A-11 Sane Conda Environment

Package Version
pip 23.2.1
setuptools 78.1.1
catboost 1.2
certifi 2024.12.14
cffi 1.15.1
conda-pack 0.6.0
contourpy 1.1.0
cryptography 45.0.3
cx-Oracle 8.3.0
cycler 0.11.0
deprecation 2.1.0
fonttools 4.55.3
graphviz 0.20.1
importlib-resources 5.12.0
jaro-winkler 2.0.3
jellyfish 0.11.2
pyjnius 1.5.0
cython 0.29.36
kiwisolver 1.4.4
Levenshtein 0.21.1
matplotlib 3.7.1
numpy 1.24.4
oracledb 1.3.2
packaging 21.3
pandas 1.5.3
Pillow 10.2.0
plotly 5.15.0
py4j 0.10.9.5
pycparser 2.21
pyparsing 3.1.0
python-dateutil 2.8.2
python-Levenshtein 0.21.1
pytz 2021.3
pyxDamerauLevenshtein 1.7.1
rapidfuzz 3.1.1
retrying 1.3.4
scipy 1.11.0
setuptools 67.8.0
six 1.16.0
tenacity 8.2.2
textdistance 4.5.0
urllib3 1.26.20
wheel 0.38.4
zipp 3.19.1