Oisin Mac Aodha committed
Commit 42065fc · Parent(s): 505e401
update readme
README.md CHANGED
@@ -1,73 +1,9 @@
<sup>Above we visualize predictions from one of our SINR models trained on data from [iNaturalist](https://www.inaturalist.org). On the left we show the learned species embedding space, where each point represents a different species. On the right we see the predicted range of the species corresponding to the red dot on the left.</sup>
## 🌍 Getting Started
#### Installing Required Packages
1. We recommend using an isolated Python environment to avoid dependency issues. Install the Anaconda Python 3.9 distribution for your operating system from [here](https://www.anaconda.com/download).
2. Create a new environment and activate it:
```bash
conda create -y --name sinr_icml python==3.9
conda activate sinr_icml
```
3. After activating the environment, install the required packages:
```bash
pip3 install -r requirements.txt
```
#### Data Download and Preparation
Instructions for downloading the data are in `data/README.md`.
## 🗺️ Generating Predictions
To generate predictions for a model in the form of an image, run the following command:
```bash
python viz_map.py --taxa_id 130714
```
Here, `--taxa_id` is the id number for a species of interest from [iNaturalist](https://www.inaturalist.org/taxa/130714). If you want to generate predictions for a random species, add the `--rand_taxa` flag instead.
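The two flags above can be sketched with `argparse`. This is a hypothetical illustration of the interface, not the actual argument handling in `viz_map.py`, whose flags and defaults may differ:

```python
import argparse

# Hypothetical sketch of viz_map.py's command-line interface.
def parse_args(argv=None):
    parser = argparse.ArgumentParser(
        description="Visualize the predicted range for one species.")
    parser.add_argument("--taxa_id", type=int, default=130714,
                        help="iNaturalist taxon id for the species of interest")
    parser.add_argument("--rand_taxa", action="store_true",
                        help="pick a random species instead of using --taxa_id")
    return parser.parse_args(argv)

args = parse_args(["--taxa_id", "130714"])
```

With no flags, the default taxon id is used; passing `--rand_taxa` would signal the script to sample a species at random.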
Note: before running this command, you must first download the data as described in `data/README.md`. In addition, if you want to evaluate some of the pretrained models from the paper, you need to download them first and place them in `sinr/pretrained_models`. See `web_app/README.md` for more details.
There is also an interactive browser-based demo available in `web_app`.
## 🚅 Training and Evaluating Models
To train and evaluate a model, run the following command:
```bash
python train_and_evaluate_models.py
```
#### Hyperparameters
Common parameters of interest can be set within `train_and_evaluate_models.py`. All other parameters are exposed in `setup.py`.
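As an illustration, the kinds of options one typically edits there might look like the following. These names are hypothetical and are not guaranteed to match the actual variables in `train_and_evaluate_models.py` or `setup.py`:

```python
# Hypothetical training configuration; the real parameter names live in
# train_and_evaluate_models.py and setup.py and may differ.
train_params = {
    "num_epochs": 10,    # number of training epochs
    "lr": 5e-4,          # learning rate
    "batch_size": 2048,  # locations per batch
}

# Any parameter not set here would fall back to the defaults exposed in setup.py.
```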
#### Outputs
By default, trained models and evaluation results will be saved to a folder in the `experiments` directory. Evaluation results will also be printed to the command line.
#### Interactive Model Visualizer
To visualize range predictions from pretrained SINR models, please follow the instructions in `web_app/README.md`.
## 🙏 Acknowledgements
This project was enabled by data from the Cornell Lab of Ornithology, the International Union for Conservation of Nature, iNaturalist, NASA, USGS, JAXA, CIESIN, and UC Merced. We are especially indebted to the [iNaturalist](https://www.inaturalist.org) and [eBird](https://ebird.org) communities for their data collection efforts. We also thank Matt Stimas-Mackey and Sam Heinrich for their help with data curation. This project was funded by the [Climate Change AI Innovation Grants](https://www.climatechange.ai/blog/2022-04-13-innovation-grants) program, hosted by Climate Change AI with the support of the Quadrature Climate Foundation, Schmidt Futures, and the Canada Hub of Future Earth. This work was also supported by the Caltech Resnick Sustainability Institute and an NSF Graduate Research Fellowship (grant number DGE1745301).
If you find our work useful in your research, please consider citing our paper:
```
@inproceedings{SINR_icml23,
  title     = {{Spatial Implicit Neural Representations for Global-Scale Species Mapping}},
  author    = {Cole, Elijah and Van Horn, Grant and Lange, Christian and Shepard, Alexander and Leary, Patrick and Perona, Pietro and Loarie, Scott and Mac Aodha, Oisin},
  booktitle = {ICML},
  year      = {2023}
}
```
## 📜 Disclaimer
Extreme care should be taken before making any decisions based on the outputs of models presented here. Our goal in this work is to demonstrate the promise of large-scale representation learning for species range estimation, not to provide definitive range maps. Our models are trained on biased data and have not been calibrated or validated beyond the experiments illustrated in the paper.
```yaml
title: SINR
emoji: π₯
colorFrom: gray
colorTo: yellow
sdk: gradio
sdk_version: 3.14.0
app_file: app.py
pinned: false
```