Skip to content
Merged
Show file tree
Hide file tree
Changes from all commits
Commits
File filter

Filter by extension

Filter by extension


Conversations
Failed to load comments.
Loading
Jump to
Jump to file
Failed to load files.
Loading
Diff view
Diff view
18 changes: 2 additions & 16 deletions .github/workflows/mr_ci.yml
Original file line number Diff line number Diff line change
Expand Up @@ -2,9 +2,8 @@
name: Units Tests

on:
pull_request:
branches:
- main
# Manual trigger only
workflow_dispatch:

# Cancel existing tests on the same PR if a new commit is added to a pull request
concurrency:
Expand Down Expand Up @@ -39,7 +38,6 @@ jobs:
- name: Install dependencies
run: |
python -m pip install ".[dev]"
python -m pip install 'git+https://github.com/rwood-97/piffle.git@iiif_dataclasses'
python -m pip install pytest-cov

- name: Quality Assurance
Expand All @@ -52,15 +50,3 @@ jobs:
- name: Test with pytest
run: |
python -m pytest ./tests --ignore=tests/test_text_spotting/


- name: Upload coverage to Codecov
uses: codecov/codecov-action@v4
with:
token: ${{ secrets.CODECOV_TOKEN }}
directory: ./coverage/reports/
env_vars: OS,PYTHON
fail_ci_if_error: false
files: ./coverage.xml,!./cache
flags: unittests
name: codecov-umbrella
25 changes: 6 additions & 19 deletions .github/workflows/mr_ci_text_spotting.yml
Original file line number Diff line number Diff line change
Expand Up @@ -50,12 +50,10 @@ jobs:
python -m pip install numpy==1.26.4 torch==2.2.2 torchvision==0.17.2 -f https://download.pytorch.org/whl/torch_stable.html
python -m pip install ".[dev]"
python -m pip install pytest-cov
python -m pip install 'git+https://github.com/facebookresearch/detectron2.git'
python -m pip install 'git+https://github.com/maps-as-data/DeepSolo.git'
python -m pip install 'git+https://github.com/maps-as-data/DPText-DETR.git'
python -m pip install 'git+https://github.com/maps-as-data/MapTextPipeline.git'
python -m pip install 'git+https://github.com/rwood-97/piffle.git@iiif_dataclasses'

python -m pip install --no-build-isolation 'git+https://github.com/facebookresearch/detectron2.git'
python -m pip install --no-build-isolation 'git+https://github.com/maps-as-data/DeepSolo.git'
python -m pip install --no-build-isolation 'git+https://github.com/maps-as-data/DPText-DETR.git'
python -m pip install --no-build-isolation 'git+https://github.com/maps-as-data/MapTextPipeline.git'

- name: Clone DPText-DETR
run: |
Expand All @@ -71,22 +69,11 @@ jobs:

- name: Hugging Face CLI
run: |
pip install -U "huggingface_hub[cli]"
pip install -U "huggingface-hub[cli]>=0.30.0,<0.34.0"
huggingface-cli download rwood-97/DPText_DETR_ArT_R_50_poly art_final.pth --local-dir .
huggingface-cli download rwood-97/DeepSolo_ic15_res50 ic15_res50_finetune_synth-tt-mlt-13-15-textocr.pth --local-dir .
huggingface-cli download rwood-97/MapTextPipeline_rumsey rumsey-finetune.pth --local-dir .

- name: Test with pytest
run: |
python -m pytest --cov=./ --cov-report=xml ./tests

- name: Upload coverage to Codecov
uses: codecov/codecov-action@v5
with:
token: ${{ secrets.CODECOV_TOKEN }}
directory: ./coverage/reports/
env_vars: OS,PYTHON
fail_ci_if_error: false
files: ./coverage.xml,!./cache
flags: unittests
name: codecov-umbrella
python -m pytest ./tests
1 change: 0 additions & 1 deletion .github/workflows/mr_pip_ci.yml
Original file line number Diff line number Diff line change
Expand Up @@ -36,7 +36,6 @@ jobs:
- name: Install dependencies
run: |
python -m pip install mapreader[dev]
python -m pip install 'git+https://github.com/rwood-97/piffle.git@iiif_dataclasses'

- name: Quality Assurance
run: |
Expand Down
4 changes: 4 additions & 0 deletions CHANGELOG.md
Original file line number Diff line number Diff line change
Expand Up @@ -14,6 +14,10 @@ The following table shows which versions of MapReader are compatible with which
## Pre-release
_Add new changes here_

### Added

- Added `piffle` package as dependency ([#575](https://github.com/maps-as-data/MapReader/pull/575))

## [v1.8.1](https://github.com/Living-with-machines/MapReader/releases/tag/v1.8.1) (2025-08-11)

### Added
Expand Down
6 changes: 0 additions & 6 deletions docs/source/using-mapreader/step-by-step-guide/1-download.rst
Original file line number Diff line number Diff line change
Expand Up @@ -526,12 +526,6 @@ For more information on IIIF, see their documentation `here <https://iiif.io/>`_

MapReader accepts any IIIF manifest which is compliant with the IIIF Presentation API (version `2 <https://iiif.io/api/presentation/2.1/>`__ or `3 <https://iiif.io/api/presentation/3.0/>`__).

First, install piffle using the command below:

.. code-block:: python

pip install piffle@git+https://github.com/rwood-97/piffle.git@iiif_dataclasses


IIIFDownloader
~~~~~~~~~~~~~~~
Expand Down
7 changes: 6 additions & 1 deletion mapreader/annotate/annotator.py
Original file line number Diff line number Diff line change
Expand Up @@ -512,7 +512,12 @@ def check_eligibility(row):
queue_df["eligible"] = queue_df.apply(check_eligibility, axis=1)

if self._sortby is not None:
queue_df.sort_values(self._sortby, ascending=self._ascending, inplace=True)
queue_df.sort_values(
by=[self._sortby, "min_y"],
ascending=[self._ascending, True],
kind="mergesort",
inplace=True,
)
queue_df = queue_df[queue_df.eligible]
else:
queue_df = queue_df[queue_df.eligible].sample(frac=1) # shuffle
Expand Down
2 changes: 1 addition & 1 deletion setup.py
Original file line number Diff line number Diff line change
Expand Up @@ -63,7 +63,7 @@
"folium>=0.12,<1.0.0",
"mapclassify>=2.0.0,<3.0.0",
"xyzservices==2024.9.0",
# "piffle @ git+https://github.com/rwood-97/piffle.git@iiif_dataclasses",
"piffle>=0.7.0",
"lxml",
],
extras_require={
Expand Down
50 changes: 25 additions & 25 deletions tests/test_classify/test_classifier.py
Original file line number Diff line number Diff line change
Expand Up @@ -161,31 +161,31 @@ def test_init_resnet18_timm(inputs):
assert classifier.dataloaders == {}


@pytest.mark.dependency(name="timm_models", scope="session")
def test_init_models_timm(inputs):
annots, dataloaders = inputs
for model2test in [
["resnest50d_4s2x40d", timm.models.ResNet],
["resnest101e", timm.models.ResNet],
["resnext101_32x8d.fb_swsl_ig1b_ft_in1k", timm.models.ResNet],
["resnet152", timm.models.ResNet],
["tf_efficientnet_b3.ns_jft_in1k", timm.models.EfficientNet],
["swin_base_patch4_window7_224", timm.models.swin_transformer.SwinTransformer],
["vit_base_patch16_224", timm.models.vision_transformer.VisionTransformer],
]: # these are models from 2021 paper
model, model_type = model2test
my_model = timm.create_model(
model, pretrained=True, num_classes=len(annots.labels_map)
)
assert isinstance(my_model, model_type)
classifier = ClassifierContainer(
my_model, labels_map=annots.labels_map, dataloaders=dataloaders
)
assert isinstance(classifier.model, model_type)
assert all(k in classifier.dataloaders.keys() for k in ["train", "test", "val"])
classifier = ClassifierContainer(my_model, labels_map=annots.labels_map)
assert isinstance(classifier.model, model_type)
assert classifier.dataloaders == {}
# @pytest.mark.dependency(name="timm_models", scope="session")
# def test_init_models_timm(inputs):
# annots, dataloaders = inputs
# for model2test in [
# ["resnest50d_4s2x40d", timm.models.ResNet],
# ["resnest101e", timm.models.ResNet],
# ["resnext101_32x8d.fb_swsl_ig1b_ft_in1k", timm.models.ResNet],
# ["resnet152", timm.models.ResNet],
# ["tf_efficientnet_b3.ns_jft_in1k", timm.models.EfficientNet],
# ["swin_base_patch4_window7_224", timm.models.swin_transformer.SwinTransformer],
# ["vit_base_patch16_224", timm.models.vision_transformer.VisionTransformer],
# ]: # these are models from 2021 paper
# model, model_type = model2test
# my_model = timm.create_model(
# model, pretrained=True, num_classes=len(annots.labels_map)
# )
# assert isinstance(my_model, model_type)
# classifier = ClassifierContainer(
# my_model, labels_map=annots.labels_map, dataloaders=dataloaders
# )
# assert isinstance(classifier.model, model_type)
# assert all(k in classifier.dataloaders.keys() for k in ["train", "test", "val"])
# classifier = ClassifierContainer(my_model, labels_map=annots.labels_map)
# assert isinstance(classifier.model, model_type)
# assert classifier.dataloaders == {}


# test loading object from pickle file
Expand Down