fix: trigger doc build all condition #2242
base: main
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅
Flags with carried forward coverage won't be shown.
Would the problem that this PR addresses get fixed as part of this other PR?
.ci/pipeline/docs.yml
Outdated
@@ -66,16 +66,23 @@ jobs:
- script: |
export PREFIX=$(dirname $(dirname $(which python)))
export DALROOT=$PREFIX
export LD_LIBRARY_PATH=$(dirname $(dirname $(which python)))/lib:$LD_LIBRARY_PATH
Sklearnex has an option to add oneDAL's path to its RPATH which could be used here - see e.g. here:
export DALRPATH=--abs-rpath
(Line 43 in 6742689: USE_ABS_RPATH = False)
Hi David,
I think that by setting DALROOT, and since this pipeline is not running in conda, build.sh will set DALRPATH automatically by entering the elif branch, since "${DALROOT}" != "${CONDA_PREFIX}".
Best,
Yue
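For reference, a minimal sketch of the condition being described (simplified and assumed; the actual logic lives in conda-recipe/build.sh and may differ):
# assumed, simplified sketch of the build.sh branch discussed above
if [ "${DALROOT}" != "${CONDA_PREFIX}" ]; then
    # oneDAL lives outside the conda prefix, so embed its absolute path in the RPATH
    DALRPATH="--abs-rpath"
fi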
It will set it in the call to `build.sh`, but not in the call to `setup.py build_ext`.
@yuejiaointel @icfaust Is there some way that the generated HTML could be linked as a clickable, renderable link from the "Docs" CI that we see at the bottom of PRs, without having to manually download it from AzureDevOps?
Some initial thoughts: I've never looked at this pipeline before, and seeing a `daal-devel` install makes me worried about how outdated these scripts are.
.ci/pipeline/docs.yml
Outdated
@@ -48,13 +47,14 @@ jobs:
vmImage: 'ubuntu-22.04'
steps:
- script: |
source /opt/intel/oneapi/compiler/latest/env/vars.sh
Hi Ethan,
Thanks for reviewing this PR! I removed this source line. I was using it to make sure the tbb .so file was available, but it is already provided by the daal-devel dependency.
Best,
Yue
.ci/pipeline/docs.yml
Outdated
bash .ci/scripts/describe_system.sh
displayName: 'System info'
- task: UsePythonVersion@0
inputs:
versionSpec: '3.11'
addToPath: true
- script: sudo apt-get update && sudo apt-get install -y clang-format
- script: sudo apt-get update && sudo apt-get install -y clang-format && sudo apt-get install -y pandoc
- script: sudo apt-get update && sudo apt-get install -y clang-format && sudo apt-get install -y pandoc
- script: sudo apt-get update && sudo apt-get install -y clang-format pandoc
The apt-get install calls can be merged into one.
Hi Ethan,
Good catch! I merged the apt-get commands into one call.
Best,
Yue
.ci/pipeline/docs.yml
Outdated
export LD_LIBRARY_PATH=$(dirname $(dirname $(which python)))/lib:$LD_LIBRARY_PATH
./conda-recipe/build.sh
cp ${DALROOT}/lib/python3.11/site-packages/daal4py/_daal4py.cpython*.so daal4py/
export LD_LIBRARY_PATH=$(dirname $(dirname $(which python)))/lib:$LD_LIBRARY_PATH
./conda-recipe/build.sh
cp ${DALROOT}/lib/python3.11/site-packages/daal4py/_daal4py.cpython*.so daal4py/
NO_DPC=1
python setup.py build_ext --inplace
If I understand correctly, I think you want the _daal4py.so just so you can import the modules of the repo in place. You can just build the extension, which will suffice. Do you also need to import `onedal` for the docs?
I think `NO_DPC` should not be necessary if the compiler doesn't support `NO_DPC`, but `NO_DIST=1` might be required.
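As an illustration (assumed flag behavior; the exact semantics are defined by this repo's setup.py and build scripts), an in-place build with these switches might look like:
# illustrative only: assumed usage of the NO_DIST / NO_DPC switches discussed above
export NO_DIST=1    # skip the distributed/MPI parts, so MPIROOT does not need to be set
# export NO_DPC=1   # optionally skip DPC++/SYCL extensions if the compiler lacks support
python setup.py build_ext --inplace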
For build_ext it is clear that NO_DPC isn't necessary; if we want to build the onedal pybind11 extension and we are using the Intel compiler, it may save some time. Will wait on @yuejiaointel to clarify what is necessary.
Note that `build_ext` will only build `daal4py`. For `sklearnex` you need `build`.
Hi Ethan, hi David!
Thanks for this helpful information. setup.py build_ext --inplace can build daal4py, but build.sh is still needed for the sklearnex docs to build.
What is removed:
- the step setting LD_LIBRARY_PATH in the build daal4py/sklearnex stage
- the step copying the built file back into the daal4py folder
What is needed:
- NO_DIST=1, otherwise build.sh complains that MPIROOT is not set
- ./conda-recipe/build.sh, otherwise the sklearnex docs give module-not-found warnings
- LD_LIBRARY_PATH in the two doc-build stages; the pipeline gives module-not-found warnings if it is not set
Given this, I am not sure whether it is better to run setup.py twice (once in place and once via build.sh) or to use the cp command. For now I'll have setup.py run twice here (a rough sketch of the resulting steps follows below). I also noticed that the last step of python setup.py build_ext --inplace copies the daal4py .so file from lib back into the daal4py folder.
Best,
Yue
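A rough sketch, under the assumptions above, of what the doc-build preparation ends up running (ordering and paths illustrative, not the exact pipeline):
# illustrative sketch of the steps described in the previous comment
export LD_LIBRARY_PATH=$(dirname $(dirname $(which python)))/lib:$LD_LIBRARY_PATH
export NO_DIST=1                      # build.sh otherwise complains that MPIROOT is unset
python setup.py build_ext --inplace   # in-place daal4py extension for the daal4py docs
./conda-recipe/build.sh               # builds onedal/sklearnex so their docs import cleanly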
@yuejiaointel A call to `python setup.py build` also builds the module in-place. One is for `daal4py` and the other for `onedal`+`sklearnex`. They are built using different frameworks which follow different setup commands.
Also, for paths involving python modules, please use `PYTHONPATH` if needed, to make it more obvious that it's about python things.
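For instance (illustrative; it assumes the in-place builds sit at the repository root):
# illustrative: make the in-place modules importable for the docs build
export PYTHONPATH=$(pwd):$PYTHONPATH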
@yuejiaointel I'm seeing some missing things in the docs generated here. In particular, autoclass is missing the constructors of [class reference omitted from this extract], and this is how it renders in the generated artifact: [screenshot omitted]. I have the same issue when building the docs locally, so I'm not sure whether it's related to this PR.
./conda-recipe/build.sh
displayName: 'Build daal4py/sklearnex'
- script: |
export LD_LIBRARY_PATH=$(dirname $(dirname $(which python)))/lib:$LD_LIBRARY_PATH
cd doc/daal4py
make html
make html 2>&1 | tee build.log
Please add this file to .gitignore.
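For example (illustrative; doc/daal4py/.gitignore is an assumed location, adjust to wherever the log is written):
# assumed path: ignore the Sphinx build log produced by the doc step
echo "build.log" >> doc/daal4py/.gitignore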
@@ -25,8 +25,16 @@ mkdir $SAMPLES_DIR

# copy jupyter notebooks
cd ..
rsync -a --exclude='doc/$SAMPLES_DIR/daal4py_data_science.ipynb' examples/notebooks/*.ipynb doc/$SAMPLES_DIR
Why would this change be needed?
Hi David,
The exclude pattern does not recognize a path; it is just read as a string literal, so I changed it to just the file name. If we don't exclude this file when copying from examples, the sklearnex docs give a warning that this file is not included in any of the rst files.
Best,
Yue
That's curious - it does work for me when I try it locally. It also appears to show up like that in stackexchange posts:
https://askubuntu.com/questions/1420321/how-to-exclude-the-multiple-directory-in-rsync
.. but it looks like just pattern matching should also work. Would it still work if you prefix it with .*?
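For illustration, the two exclude forms being discussed (a sketch of the behavior, not the exact script):
# path-style exclude: the single quotes keep $SAMPLES_DIR unexpanded, and a pattern containing
# a slash is matched against the transfer-relative path, so it never matches the copied notebooks
rsync -a --exclude='doc/$SAMPLES_DIR/daal4py_data_science.ipynb' examples/notebooks/*.ipynb doc/$SAMPLES_DIR
# bare-filename exclude: matches the notebook at any depth of the transfer
rsync -a --exclude='daal4py_data_science.ipynb' examples/notebooks/*.ipynb doc/$SAMPLES_DIR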
doc/daal4py/Makefile
Outdated
@@ -33,4 +33,4 @@ help:
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
Better to leave the empty line at the end here.
.ci/pipeline/docs.yml
Outdated
python setup.py build_ext --inplace
./conda-recipe/build.sh
There's a call to both `build_ext` and the conda recipe's build. What's the idea for having both? That'd leave you with one importable module (`daal4py`) being in two places, with one of them having oneDAL in the rpath and the other without.
If you want to build both extensions in-place, you should replace the call to `build.sh` with this:
python setup.py build --abs-rpath
If that doesn't work, perhaps it would require these two:
python setup.py build_ext --inplace --abs-rpath
python setup.py build --abs-rpath
doc/build-doc.sh
Outdated
cat build.log

# check for autodoc warnings
if grep "WARNING" build.log; then
On a second look, maybe it'd be easier to follow if both of these checks (daal4py/sklearnex) were in the .yaml files.
doc/build-doc.sh
Outdated
# check for autodoc warnings
if grep "WARNING" build.log; then
    echo "Warnings detected, build failed!"
    exit 1
This block should not be needed if you are passing option `-W` to sphinx. Also, please leave the empty lines at the end of files.
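For example (the Makefile shown above forwards SPHINXOPTS to sphinx-build, so this should be enough):
# let Sphinx itself fail the build on warnings instead of grepping the log afterwards
make html SPHINXOPTS="-W --keep-going"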
Description
In this PR I fixed several problems so that the docs now build without autodoc warnings. I also added code so that, in the future, if autodoc gives a warning it will be treated as an error and the pipeline will fail. The build still has duplicate warnings because we have multiple identical class definitions in daal4py.
List associated issue number(s) if they exist: #8786
PR should start as a draft, then move to the ready-for-review state after CI has passed and all applicable checkboxes are completed.
This approach ensures that reviewers don't spend extra time asking for regular requirements.
You can remove a checkbox as not applicable only if it doesn't relate to this PR in any way.
For example, a PR with a docs update doesn't require checkboxes for performance, while a PR with any change to actual code should have the checkboxes and justify how the change is expected to affect performance (or the justification should be self-evident).
Checklist to comply with before moving PR from draft:
PR completeness and readability
Testing
Performance