docs/ROS.rst

The UAVF ROS Package
********************
The ``uavfpy`` package in ``main`` is a python package that does not depend on ROS. We do use ROS to orchestrate the mission, so we have a ROS package as well.
``uavfros`` is the name of the ROS package. Its development shares an issue tracker and repository with the main python package, but its development happens on the ``ROS`` and ``ROS-dev`` branches of the repository.
Installation
============

Prerequisites
`````````````

In order to develop packages on ROS, you need a PC running Linux. Any desktop Linux distribution is suitable, but the easiest by far is Ubuntu. I prefer Ubuntu MATE on the desktop, but you can use standard Ubuntu, a KDE flavor, or whichever you like.
``uavfros`` is a ROS package. To install it, you need to have ROS installed and configured. That will not be covered in this documentation; if you are brand new to ROS, I recommend that you go through the ROS tutorial [1]_ before continuing to the next section.
.. note::

You will see several new nodes:

This is the simulated PX4, which can be commanded with MAVROS.
Install ``uavfros``
```````````````````
.. warning::
   Because we are using this package from ROS, we need to ensure that we are NOT in any python virtual environment. You can verify this by typing ``which python`` into a terminal window. Make sure that the output is ``/usr/bin/python``.
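You can also perform this check from Python itself. The snippet below is a generic standard-library sketch, not part of ``uavfros``: inside a virtual environment, ``sys.prefix`` differs from ``sys.base_prefix``.

```python
import sys

# In a virtual environment, sys.prefix points at the venv, while
# sys.base_prefix still points at the system installation.
in_venv = sys.prefix != sys.base_prefix
print("virtualenv active" if in_venv else "using system python")
```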

Run ``catkin_make`` and source your ``devel/setup.bash`` file:

Make sure you remember to start a ``roscore`` instance in a separate terminal window.
Running a Mission with ``uavfros``
==================================
Until we have viable hardware testing, this section deals with running a simulated mission with ``uavfros``.
Run ``uavfros`` Interop
-----------------------
The interop client is a ROS node written in Python. We start it with ``rosrun``.
.. code-block:: bash
   rosrun uavfros interop
Run ``uavfros`` Planner
-----------------------
The navigation node is a ROS service node that will generate a new path for the UAV to follow between waypoints.
.. code-block:: bash
   rosrun uavfros planner
Run ``uavfros`` GNC
-------------------
The ``uavfros`` GNC node is a ROS node that takes a computed plan and manages its execution on the UAV.
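As a rough illustration of what "managing the execution of a plan" means, here is a plain-Python sketch. The function name, the waypoint tuples, and the direct ``goto`` callable are all invented for illustration; the real node commands the vehicle over ROS topics and MAVROS rather than direct calls.

```python
def execute_plan(waypoints, goto):
    """Drive through a plan by commanding each waypoint in order.

    `goto` stands in for whatever mechanism commands the vehicle
    (in the real node, a MAVROS setpoint publisher).
    """
    completed = []
    for wp in waypoints:
        goto(wp)              # command the vehicle toward the waypoint
        completed.append(wp)  # record progress through the plan
    return completed

commands = []
plan = [(33.64, -117.84, 30.0), (33.65, -117.83, 30.0)]
completed = execute_plan(plan, commands.append)
```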

docs/contributing.rst

   :width: 50%
   :align: center
So, to avoid development hell, we put the bulk of the functionality into the ``main`` branch, install the ``main`` package (and all of its dependencies) onto the vehicle's system Python, and then we can just import the :py:mod:`uavfpy` package and use its functionality in our ROS scripts.
The Golden Rule of ROS Development
``````````````````````````````````

Then, call it from the piece of code in the ``ROS`` branch:

.. code-block:: python
312
312
313
-
from Pipeline import pipeline
313
+
   from uavfpy.Pipeline import pipeline
   def publish_pixels(pipeline, image):
       pixels = pipeline.count_pixels(image)
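This pattern can be exercised without ROS at all, which is exactly the point of the Golden Rule. Below is a self-contained sketch: the ``Pipeline`` class and its ``count_pixels`` implementation are invented stand-ins (only the names come from the snippet above; the real ``uavfpy`` API differs), while the wrapper shows how thin the ROS-side glue stays.

```python
# Hypothetical stand-ins for illustration; the real uavfpy Pipeline differs.
class Pipeline:
    def __init__(self, threshold=128):
        self.threshold = threshold

    def count_pixels(self, image):
        """Pure logic: count pixels at or above the threshold."""
        return sum(1 for row in image for px in row if px >= self.threshold)

published = []

def publish_pixels(pipeline, image):
    # ROS-side glue: delegates all real work to the pure package code,
    # then hands the result to a publisher-like object.
    published.append(pipeline.count_pixels(image))

publish_pixels(Pipeline(), [[0, 200], [130, 50]])
```

Because the logic lives in a plain class, it can be unit-tested on any machine with no ROS installation.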

Documentation
=============

We have attempted to make writing documentation as easy as possible -- and as close to the codebase as possible! This documentation contains pages written manually (such as this guide). Manual documentation is written in reStructuredText, a commonly used format for software documentation. To get started writing manual documentation with reStructuredText, read the `reStructuredText Primer <https://www.sphinx-doc.org/en/master/usage/restructuredtext/basics.html>`_.
The second type of documentation is the auto-generated documentation, which is generated from in-line comments in the codebase. You don't need to touch anything in the ``docs/`` folder to write this documentation -- just comment your code, and your comments are added to the API page (:py:mod:`uavfpy`) automatically. The API page rebuilds itself automatically whenever pushes are made to the ``main`` branch of the repository.
We use `Sphinx <https://www.sphinx-doc.org/en/master/index.html>`_ and a tool called `Sphinx Autoapi <https://github.com/readthedocs/sphinx-autoapi>`_ to automatically generate descriptions and API documentation for any class or method with a numpy-formatted docstring. This tool automatically parses the codebase.

docs/getting_started.rst

Getting Started
***************
This page is a guide on how to get started with the :py:mod:`uavfpy` API.
Prerequisites
=============
We have tested the API under Linux and macOS. Development of :py:mod:`uavfpy` is also possible under Windows.
Our release targets Python 3.8.
.. note::
   :py:mod:`uavfpy.odcl` uses the TFLite runtime for inference. You can perform inference on the CPU, but this can be very slow. The vehicle uses the `Coral Edge TPU <https://www.coral.ai/docs/>`_ for on-board acceleration of inference.
   The Coral Edge TPU is an ASIC developed by Google specifically designed for accelerating deep learning. If you do not have access to an Edge TPU, you can use the CPU for inference.
First, we import necessary modules:

.. code-block:: python
   # import classes
   from uavfpy.odcl.inference import TargetInterpreter, Tiler
The :py:class:`uavfpy.odcl.inference.TargetInterpreter` class handles inputs and outputs to the neural network for object detection. We give it paths to the model and labels, tell it whether to run on CPU or TPU, and set the threshold for detection.
Instantiating a :py:class:`uavfpy.odcl.inference.TargetInterpreter` object takes a while, so this object should be created outside of a loop if latency is a concern.
.. code-block:: python
       order_key="efficientdetd0",
   )
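The "construct once, reuse many times" advice can be sketched generically. The ``FakeInterpreter`` class below is a stand-in invented for illustration, not the real :py:class:`uavfpy.odcl.inference.TargetInterpreter`:

```python
class FakeInterpreter:
    """Stand-in for an object that is expensive to construct."""
    constructions = 0

    def __init__(self):
        # Pretend this loads a large model from disk.
        FakeInterpreter.constructions += 1

    def interpret(self, frame):
        return [("target", frame)]

# Good: construct once, outside the loop, then reuse for every frame.
interp = FakeInterpreter()
detections = [interp.interpret(frame) for frame in range(3)]
```

Constructing the interpreter inside the loop would pay the model-loading cost on every frame.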
Next, we create the :py:class:`uavfpy.odcl.inference.Tiler`, which handles the tiling of the input image. We are dealing with inputs that are very large compared to the inputs of the neural network; the tiler will decompose the image into overlapping tiles, feed the NN, and then parse NN outputs from the respective tiles back into the raw image.
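The tiling step can be illustrated with a small sketch in plain Python. This is not the actual :py:class:`uavfpy.odcl.inference.Tiler` API; the function below merely shows how overlapping tile offsets covering a large image might be computed.

```python
def tile_offsets(img_w, img_h, tile, overlap):
    """Return (x, y) top-left corners of overlapping square tiles
    covering an img_w x img_h image. Illustrative only; the real
    Tiler's interface may differ."""
    step = tile - overlap
    xs = list(range(0, max(img_w - tile, 0) + 1, step))
    ys = list(range(0, max(img_h - tile, 0) + 1, step))
    # Make sure the right and bottom edges are covered.
    if xs[-1] + tile < img_w:
        xs.append(img_w - tile)
    if ys[-1] + tile < img_h:
        ys.append(img_h - tile)
    return [(x, y) for y in ys for x in xs]

# A 1000x600 image covered by 300px tiles with 50px of overlap.
offsets = tile_offsets(1000, 600, tile=300, overlap=50)
```

Each offset identifies one tile to feed the network; detections from each tile are then mapped back into raw-image coordinates.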
:py:class:`uavfpy.odcl.color.Color` is a class used to extract color information from found targets. For now, it does not take any arguments.
:py:class:`uavfpy.odcl.utils.drawer.TargetDrawer` is a utility class used to draw bounding boxes. Passing it as an argument will draw bounding boxes on the raw image and store the result into the :py:class:`Pipeline`'s :py:attr:`drawn` attribute. Passing it will also open a window to display targets that were found, along with the shape color-mask. Therefore, it is useful for evaluating the performance of the pipeline in real time.
If a :py:class:`TargetDrawer` is not passed to the :py:class:`uavfpy.odcl.pipeline.Pipeline` constructor, the :py:class:`Pipeline` will not draw bounding boxes on the image, nor will found targets be displayed.
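The optional-drawer behavior follows a common pattern: an optional collaborator that defaults to ``None``. The sketch below uses invented names (``SketchPipeline``, a callable drawer) to show the shape of the idea; the real :py:class:`uavfpy.odcl.pipeline.Pipeline` constructor may differ.

```python
# Hypothetical sketch of the optional-drawer pattern.
class SketchPipeline:
    def __init__(self, drawer=None):
        self.drawer = drawer  # optional collaborator
        self.drawn = None

    def run(self, image):
        targets = ["target"]  # stand-in for real detection results
        if self.drawer is not None:
            # Only draw when a drawer was supplied.
            self.drawn = self.drawer(image, targets)
        return targets

quiet = SketchPipeline()
quiet.run("img")            # quiet.drawn stays None

verbose = SketchPipeline(drawer=lambda img, t: img + "+boxes")
verbose.run("img")          # verbose.drawn == "img+boxes"
```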
.. code-block:: python

docs/overview.rst

Purpose of This Software
========================

This package is a collection of tools for UCI's competition team at the `AUVSI SUAS <https://www.auvsi-suas.org/>`_. The AUVSI SUAS is a student competition in which an Autonomous Aerial System navigates through waypoints, avoids other vehicles and static obstacles, identifies and submits objects on the ground, and performs mapping tasks.
The :py:mod:`uavfpy` package contains Python modules for:

* Autonomous Navigation (:py:mod:`uavfpy.planner`)
* Object Detection, Classification, and Localization (:py:mod:`uavfpy.odcl`)
* Interoperability with the AUVSI SUAS
This package is intended to be deployed both on the vehicle and on the ground station. To orchestrate the mission and manage communications between the vehicle and the ground, we use `ROS Noetic <http://wiki.ros.org/noetic>`_.

Repository Structure
====================

Therefore, there are two branches in the repository:
* ``main`` -- contains the ``uavfpy`` python package
* ``ROS`` -- contains the ``uavfros`` ROS package
The development of these two branches is kept *entirely separate*.