Repository file listing, models/ directory.
Commits: "Upload 12 files" (5d490e3) for the twelve .pkl model files; "Create models/Readme.md" for the readme.

models/Readme.md        0 Bytes
decisionTree.pkl        619 kB
decisionTree_ica.pkl    617 kB
decisionTree_pca.pkl    934 kB
knn_pca.pkl             69.3 MB
logReg.pkl              48.2 kB
logReg_ica.pkl          48.2 kB
logReg_pca.pkl          12.8 kB
randomForest.pkl        67.6 MB
randomForest_ica.pkl    67.6 MB
xgboost.pkl             1.9 MB
xgboost_ica.pkl         1.92 MB
xgboost_pca.pkl         1.94 MB

Every .pkl file triggers the same Hugging Face pickle scan. For each, the scanner reports six detected pickle imports (listed in varying order per file, but the set is identical):
- "numpy.core.multiarray.scalar"
- "sklearn.preprocessing._data.StandardScaler"
- "joblib.numpy_pickle.NumpyArrayWrapper"
- "sklearn.pipeline.Pipeline"
- "numpy.ndarray"
- "numpy.dtype"
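The detected imports indicate that each .pkl file holds a scikit-learn Pipeline (a StandardScaler followed by an estimator) serialized with joblib, whose NumpyArrayWrapper is the array container the scan flags. A minimal sketch of that round-trip, assuming scikit-learn and joblib are installed; the file name matches the listing, but the synthetic data and the choice of LogisticRegression here are illustrative, not taken from the repository:

```python
import joblib
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Build and fit a pipeline shaped like the ones the scan describes:
# a StandardScaler feeding a fitted estimator.
rng = np.random.RandomState(0)
X = rng.normal(size=(100, 4))
y = (X[:, 0] > 0).astype(int)
pipe = Pipeline([("scaler", StandardScaler()),
                 ("clf", LogisticRegression())])
pipe.fit(X, y)

# Serialize with joblib, then load it back. joblib.load executes
# pickle bytecode, so only files from a trusted source should be
# loaded this way.
joblib.dump(pipe, "logReg.pkl")
restored = joblib.load("logReg.pkl")
print(restored.predict(X[:5]))
```

The restored object is a fully fitted Pipeline, so its predictions match the original's exactly.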