Merge branch 'main' into task/1373

@@ -60,13 +60,14 @@ runs:
          cat environment.yml

      - name: Setup conda environment
-       uses: mamba-org/provision-with-micromamba@main
+       uses: mamba-org/setup-micromamba@v1
        with:
          environment-file: environment.yml
          environment-name: env
-         channels: conda-forge
-         cache-env: true
-         cache-env-key: ${{ runner.os }}${{ runner.arch }}-${{ env.WEEK }}-${{ hashFiles('environment.yml') }}
+         init-shell: bash
+         cache-environment: true
+         cache-environment-key: ${{ runner.os }}${{ runner.arch }}-${{ env.WEEK }}-${{ hashFiles('environment.yml') }}

      - name: List conda environment
        shell: bash -l {0}

@@ -4,4 +4,4 @@ Fixes #

## Checklist
- [ ] Updated HISTORY.rst and link to any relevant issue (if these changes are user-facing)
- [ ] Updated the user's guide (if needed)
- [ ] Tested the affected models' UIs (if relevant)
- [ ] Tested the Workbench UI (if relevant)

@@ -46,6 +46,25 @@ jobs:
          # exit-zero treats all errors as warnings. The GitHub editor is 127 chars wide
          python -m flake8 src --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics

+  check-history-rst-syntax:
+    name: Check HISTORY RST syntax
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v3
+
+      - uses: actions/setup-python@v4
+        name: Set up python
+        with:
+          python-version: ${{ env.LATEST_SUPPORTED_PYTHON_VERSION }}
+
+      - name: Set up environment
+        run: pip install doc8
+
+      - name: Lint with doc8
+        run: |
+          # Skip line-too-long errors (D001)
+          python -m doc8 --ignore D001 HISTORY.rst
+
  run-model-tests:
    name: Run model tests
    runs-on: ${{ matrix.os }}

@@ -317,7 +336,7 @@ jobs:
        run: make userguide

      - name: Build binaries
-       run: make CONDA=micromamba binaries
+       run: make CONDA="$MAMBA_EXE" binaries

      - name: Run invest-autotest with binaries
        if: |

HISTORY.rst

@@ -41,6 +41,12 @@ Unreleased Changes
* Fixed a bug in the CLI where ``invest getspec --json`` failed on
  non-json-serializable objects such as ``pint.Unit``.
  https://github.com/natcap/invest/issues/1280
* A new directory at ``./doc/decision-records`` has been created for
  "Architecture/Any Decision Records", which will serve as a record of
  nontrivial decisions made about InVEST and why. This is
  intended for reference by our science and software teams, and also by
  the community at large when inquiring about a nontrivial change.
  https://github.com/natcap/invest/issues/1079
* Updated the package installation instructions in the API docs for clarity
  and also to highlight the ease of installation through ``conda-forge``.
  https://github.com/natcap/invest/issues/1256

@@ -52,6 +58,13 @@ Unreleased Changes
  orthogonal. Now no models call ``validate`` from ``execute``. This
  affected AWY, CV, UFRM, Wave Energy, and Wind Energy.
  (`#1373 <https://github.com/natcap/invest/issues/1373>`_)
* Improved the validation message that is returned when not all spatial
  inputs overlap (`#502 <https://github.com/natcap/invest/issues/502>`_)
* Standardized the name and location of the taskgraph cache directory for
  all models. It is now called ``taskgraph_cache`` and located in the top
  level of the workspace directory.
  (`#1230 <https://github.com/natcap/invest/issues/1230>`_)
* InVEST is now distributed under the Apache 2.0 License.
* Workbench
  * Fixed a bug where sampledata downloads failed silently (and the progress
    bar became inaccurate) if the Workbench did not have write permission to

@@ -69,6 +82,12 @@ Unreleased Changes
  * Middle clicking an InVEST model tab was opening a blank window. Now
    middle clicking will close that tab as expected.
    (`#1261 <https://github.com/natcap/invest/issues/1261>`_)
  * Updated the InVEST logo to the new version with the registered
    trademark symbol.
    (https://naturalcapitalproject.stanford.edu/invest-trademark-and-logo-use-policy)
* Coastal Blue Carbon
  * Added validation for the transition table, raising a validation error if
    unexpected values are encountered.
    (`#729 <https://github.com/natcap/invest/issues/729>`_)
* Forest Carbon
  * The biophysical table is now case-insensitive.
* HRA

@@ -76,7 +95,19 @@ Unreleased Changes
    consequence criteria were skipped for a single habitat. The model now
    correctly handles this case. https://github.com/natcap/invest/issues/1250
  * Tables in the .xls format are no longer supported. This format was
-   deprecated by ``pandas``. (`#1271 <https://github.com/natcap/invest/issues/1271>`_)
+   deprecated by ``pandas``.
+   (`#1271 <https://github.com/natcap/invest/issues/1271>`_)
  * Fixed a bug where vector inputs could be rasterized onto a grid that is
    not exactly aligned with other raster inputs.
    (`#1312 <https://github.com/natcap/invest/issues/1312>`_)
  * Dropped support for Excel (.xlsx) files
    (`#1391 <https://github.com/natcap/invest/issues/1391>`_)
* NDR
  * The contents of the output ``cache_dir`` have been consolidated into
    ``intermediate_outputs``.
  * Fixed a bug where results were calculated incorrectly if the runoff proxy
    raster (or the DEM or LULC) had no nodata value
    (`#1005 <https://github.com/natcap/invest/issues/1005>`_)
* Pollination
  * Several exceptions have been tidied up so that only fieldnames are
    printed instead of the python data structures representing the whole

@@ -102,6 +133,20 @@ Unreleased Changes
  * Fixed an issue with sediment deposition progress logging that was
    causing the "percent complete" indicator to not progress linearly.
    https://github.com/natcap/invest/issues/1262
  * The contents of the output ``churn_dir_not_for_humans`` have been
    consolidated into ``intermediate_outputs``.
  * We implemented two major functional changes to the InVEST LS Factor
    that significantly affect most outputs of SDR and will bring the LS
    factor output more in line with the outputs of SAGA-GIS's LS Factor.
    A discussion of differences between these two implementations can be
    viewed at https://github.com/natcap/invest/tree/main/doc/decision-records/ADR-0001-Update-SDR-LS-Factor.md.
    The two specific changes implemented are:

    * The LS Factor's on-pixel aspect length is now calculated as
      ``abs(sin(slope)) + abs(cos(slope))``.
    * The LS Factor's upstream contributing area is now calculated as
      an estimate of the specific catchment area,
      ``sqrt(n_pixels_upstream * pixel_area)``.
* Seasonal Water Yield
  * Fixed a bug where monthly quickflow nodata pixels were not being passed
    on to the total quickflow raster, which could result in negative values

@@ -121,14 +166,68 @@ Unreleased Changes
    where s_i / a_im > 100. This is done to avoid overflow errors when
    calculating edge cases where the result would round down to 0 anyway.
    (`#1318 <https://github.com/natcap/invest/issues/1318>`_)
  * The contents of the output ``cache_dir`` have been consolidated into
    ``intermediate_outputs``.
* Urban Flood Risk
  * Fixed a bug where the model incorrectly raised an error if the
    biophysical table contained a row of all 0s.
    (`#1123 <https://github.com/natcap/invest/issues/1123>`_)
  * The contents of the output ``temp_working_dir_not_for_humans`` have been
    consolidated into ``intermediate_files``.
  * Biophysical table Workbench validation now warns if there is a missing
    curve number value.
    (`#1346 <https://github.com/natcap/invest/issues/1346>`_)
* Urban Nature Access
  * Urban nature supply outputs have been renamed to add ``percapita`` to the
    filename.

    * In uniform search radius mode, ``urban_nature_supply.tif`` has been
      renamed to ``urban_nature_supply_percapita.tif``.
    * When defining search radii by urban nature class,
      ``urban_nature_supply_lucode_[LUCODE].tif`` has been renamed to
      ``urban_nature_supply_percapita_lucode_[LUCODE].tif``.
    * When defining search radii by population groups,
      ``urban_nature_supply_to_[POP_GROUP].tif`` has been renamed to
      ``urban_nature_supply_percapita_to_[POP_GROUP].tif``.

  * A new output for "Accessible Urban Nature" is created, indicating the
    area of accessible greenspace available to people within the search
    radius, weighted by the selected decay function. The outputs vary
    slightly depending on the selected execution mode.

    * In uniform search radius mode, a single new output is created,
      ``accessible_urban_nature.tif``.
    * When defining search radii by urban nature class, one new
      output raster is created for each class of urban nature. These files
      are named ``accessible_urban_nature_lucode_[LUCODE].tif``.
    * When defining search radii for population groups, one new output
      raster is created for each population group. These files are named
      ``accessible_urban_nature_to_[POP_GROUP].tif``.

  * Urban nature classes can now be defined to occupy a proportion of a
    pixel, such as a park that is semi-developed. This proportion is
    provided through user input as a proportion (0-1) in the
    ``urban_nature`` column of the LULC Attribute Table. A value of ``0``
    indicates that there is no urban nature in this class, ``0.333``
    indicates that a third of the area of this LULC class is urban nature,
    and ``1`` would indicate that the entire LULC class's area is urban
    nature. https://github.com/natcap/invest/issues/1180
  * Fixed an issue where, under certain circumstances, the model would raise
    a cryptic ``TypeError`` when creating the summary vector.
    https://github.com/natcap/invest/issues/1350
* Visitation: Recreation and Tourism
  * Fixed a bug where overlapping predictor polygons would be double-counted
-   in ``polygon_area_coverage`` and ``polygon_percent_coverage`` calculations.
-   (`#1310 <https://github.com/natcap/invest/issues/1310>`_)
+   in ``polygon_area_coverage`` and ``polygon_percent_coverage``
+   calculations. (`#1310 <https://github.com/natcap/invest/issues/1310>`_)
  * Changed the calculation of the ``point_nearest_distance`` metric to match
    the description in the User's Guide. Values are now the distance to the
    centroid of the AOI polygon instead of the distance to the nearest
    edge of the AOI polygon.
    (`#1347 <https://github.com/natcap/invest/issues/1347>`_)
* Wind Energy
  * Updated a misleading error message that is raised when the AOI does
    not spatially overlap another input.
    (`#1054 <https://github.com/natcap/invest/issues/1054>`_)

3.13.0 (2023-03-17)
-------------------

LICENSE.txt

@@ -1,38 +1,202 @@
-In this license, "Natural Capital Project" is defined as the parties of
-Stanford University, The Nature Conservancy, World Wildlife Fund Inc.,
-and University of Minnesota.

-This tool has an open license. All people are invited to use the tool
-under the following conditions and terms:
+Apache License
+Version 2.0, January 2004
+http://www.apache.org/licenses/

-Copyright (c) 2022, Natural Capital Project
+TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

-All rights reserved.
+1. Definitions.

-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions are
-met:
+"License" shall mean the terms and conditions for use, reproduction,
+and distribution as defined by Sections 1 through 9 of this document.

-* Redistributions of source code must retain the above copyright
-notice, this list of conditions and the following disclaimer.
+"Licensor" shall mean the copyright owner or entity authorized by
+the copyright owner that is granting the License.

-* Redistributions in binary form must reproduce the above copyright
-notice, this list of conditions and the following disclaimer in the
-documentation and/or other materials provided with the
-distribution.
+"Legal Entity" shall mean the union of the acting entity and all
+other entities that control, are controlled by, or are under common
+control with that entity. For the purposes of this definition,
+"control" means (i) the power, direct or indirect, to cause the
+direction or management of such entity, whether by contract or
+otherwise, or (ii) ownership of fifty percent (50%) or more of the
+outstanding shares, or (iii) beneficial ownership of such entity.

-* Neither the name of Natural Capital Project nor the names of
-its contributors may be used to endorse or promote products derived
-from this software without specific prior written permission.
+"You" (or "Your") shall mean an individual or Legal Entity
+exercising permissions granted by this License.

-THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
-"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
-LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
-A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
-OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
-SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
-LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
-DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
-THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
-(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
-OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+"Source" form shall mean the preferred form for making modifications,
+including but not limited to software source code, documentation
+source, and configuration files.

+"Object" form shall mean any form resulting from mechanical
+transformation or translation of a Source form, including but
+not limited to compiled object code, generated documentation,
+and conversions to other media types.

+"Work" shall mean the work of authorship, whether in Source or
+Object form, made available under the License, as indicated by a
+copyright notice that is included in or attached to the work
+(an example is provided in the Appendix below).

+"Derivative Works" shall mean any work, whether in Source or Object
+form, that is based on (or derived from) the Work and for which the
+editorial revisions, annotations, elaborations, or other modifications
+represent, as a whole, an original work of authorship. For the purposes
+of this License, Derivative Works shall not include works that remain
+separable from, or merely link (or bind by name) to the interfaces of,
+the Work and Derivative Works thereof.

+"Contribution" shall mean any work of authorship, including
+the original version of the Work and any modifications or additions
+to that Work or Derivative Works thereof, that is intentionally
+submitted to Licensor for inclusion in the Work by the copyright owner
+or by an individual or Legal Entity authorized to submit on behalf of
+the copyright owner. For the purposes of this definition, "submitted"
+means any form of electronic, verbal, or written communication sent
+to the Licensor or its representatives, including but not limited to
+communication on electronic mailing lists, source code control systems,
+and issue tracking systems that are managed by, or on behalf of, the
+Licensor for the purpose of discussing and improving the Work, but
+excluding communication that is conspicuously marked or otherwise
+designated in writing by the copyright owner as "Not a Contribution."

+"Contributor" shall mean Licensor and any individual or Legal Entity
+on behalf of whom a Contribution has been received by Licensor and
+subsequently incorporated within the Work.

+2. Grant of Copyright License. Subject to the terms and conditions of
+this License, each Contributor hereby grants to You a perpetual,
+worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+copyright license to reproduce, prepare Derivative Works of,
+publicly display, publicly perform, sublicense, and distribute the
+Work and such Derivative Works in Source or Object form.

+3. Grant of Patent License. Subject to the terms and conditions of
+this License, each Contributor hereby grants to You a perpetual,
+worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+(except as stated in this section) patent license to make, have made,
+use, offer to sell, sell, import, and otherwise transfer the Work,
+where such license applies only to those patent claims licensable
+by such Contributor that are necessarily infringed by their
+Contribution(s) alone or by combination of their Contribution(s)
+with the Work to which such Contribution(s) was submitted. If You
+institute patent litigation against any entity (including a
+cross-claim or counterclaim in a lawsuit) alleging that the Work
+or a Contribution incorporated within the Work constitutes direct
+or contributory patent infringement, then any patent licenses
+granted to You under this License for that Work shall terminate
+as of the date such litigation is filed.

+4. Redistribution. You may reproduce and distribute copies of the
+Work or Derivative Works thereof in any medium, with or without
+modifications, and in Source or Object form, provided that You
+meet the following conditions:

+(a) You must give any other recipients of the Work or
+Derivative Works a copy of this License; and

+(b) You must cause any modified files to carry prominent notices
+stating that You changed the files; and

+(c) You must retain, in the Source form of any Derivative Works
+that You distribute, all copyright, patent, trademark, and
+attribution notices from the Source form of the Work,
+excluding those notices that do not pertain to any part of
+the Derivative Works; and

+(d) If the Work includes a "NOTICE" text file as part of its
+distribution, then any Derivative Works that You distribute must
+include a readable copy of the attribution notices contained
+within such NOTICE file, excluding those notices that do not
+pertain to any part of the Derivative Works, in at least one
+of the following places: within a NOTICE text file distributed
+as part of the Derivative Works; within the Source form or
+documentation, if provided along with the Derivative Works; or,
+within a display generated by the Derivative Works, if and
+wherever such third-party notices normally appear. The contents
+of the NOTICE file are for informational purposes only and
+do not modify the License. You may add Your own attribution
+notices within Derivative Works that You distribute, alongside
+or as an addendum to the NOTICE text from the Work, provided
+that such additional attribution notices cannot be construed
+as modifying the License.

+You may add Your own copyright statement to Your modifications and
+may provide additional or different license terms and conditions
+for use, reproduction, or distribution of Your modifications, or
+for any such Derivative Works as a whole, provided Your use,
+reproduction, and distribution of the Work otherwise complies with
+the conditions stated in this License.

+5. Submission of Contributions. Unless You explicitly state otherwise,
+any Contribution intentionally submitted for inclusion in the Work
+by You to the Licensor shall be under the terms and conditions of
+this License, without any additional terms or conditions.
+Notwithstanding the above, nothing herein shall supersede or modify
+the terms of any separate license agreement you may have executed
+with Licensor regarding such Contributions.

+6. Trademarks. This License does not grant permission to use the trade
+names, trademarks, service marks, or product names of the Licensor,
+except as required for reasonable and customary use in describing the
+origin of the Work and reproducing the content of the NOTICE file.

+7. Disclaimer of Warranty. Unless required by applicable law or
+agreed to in writing, Licensor provides the Work (and each
+Contributor provides its Contributions) on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+implied, including, without limitation, any warranties or conditions
+of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+PARTICULAR PURPOSE. You are solely responsible for determining the
+appropriateness of using or redistributing the Work and assume any
+risks associated with Your exercise of permissions under this License.

+8. Limitation of Liability. In no event and under no legal theory,
+whether in tort (including negligence), contract, or otherwise,
+unless required by applicable law (such as deliberate and grossly
+negligent acts) or agreed to in writing, shall any Contributor be
+liable to You for damages, including any direct, indirect, special,
+incidental, or consequential damages of any character arising as a
+result of this License or out of the use or inability to use the
+Work (including but not limited to damages for loss of goodwill,
+work stoppage, computer failure or malfunction, or any and all
+other commercial damages or losses), even if such Contributor
+has been advised of the possibility of such damages.

+9. Accepting Warranty or Additional Liability. While redistributing
+the Work or Derivative Works thereof, You may choose to offer,
+and charge a fee for, acceptance of support, warranty, indemnity,
+or other liability obligations and/or rights consistent with this
+License. However, in accepting such obligations, You may act only
+on Your own behalf and on Your sole responsibility, not on behalf
+of any other Contributor, and only if You agree to indemnify,
+defend, and hold each Contributor harmless for any liability
+incurred by, or claims asserted against, such Contributor by reason
+of your accepting any such warranty or additional liability.

+END OF TERMS AND CONDITIONS

+APPENDIX: How to apply the Apache License to your work.

+To apply the Apache License to your work, attach the following
+boilerplate notice, with the fields enclosed by brackets "[]"
+replaced with your own identifying information. (Don't include
+the brackets!) The text should be enclosed in the appropriate
+comment syntax for the file format. We also recommend that a
+file or class name and description of purpose be included on the
+same "printed page" as the copyright notice for easier
+identification within third-party archives.

+Copyright [yyyy] [name of copyright owner]

+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at

+http://www.apache.org/licenses/LICENSE-2.0

+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.

Makefile

@@ -2,15 +2,15 @@
DATA_DIR := data
GIT_SAMPLE_DATA_REPO := https://bitbucket.org/natcap/invest-sample-data.git
GIT_SAMPLE_DATA_REPO_PATH := $(DATA_DIR)/invest-sample-data
-GIT_SAMPLE_DATA_REPO_REV := a58b9c7bdd8a31cab469ea919fe0ebf23a6c668e
+GIT_SAMPLE_DATA_REPO_REV := 2e7cd618c661ec3f3b2a3bddfd2ce7d4704abc05

GIT_TEST_DATA_REPO := https://bitbucket.org/natcap/invest-test-data.git
GIT_TEST_DATA_REPO_PATH := $(DATA_DIR)/invest-test-data
-GIT_TEST_DATA_REPO_REV := 2749f8e984c9030ae30ab6d43e7075b7a2d27cf8
+GIT_TEST_DATA_REPO_REV := e7d32d65612f4f3578a4fb57824af4e297c65283

GIT_UG_REPO := https://github.com/natcap/invest.users-guide
GIT_UG_REPO_PATH := doc/users-guide
-GIT_UG_REPO_REV := 51e2fa74e4a1d09c1ab43dcda681027a833232cd
+GIT_UG_REPO_REV := 51a061b036b74034dcea81c029dfd1a44921bc71

ENV = "./env"
ifeq ($(OS),Windows_NT)

@@ -0,0 +1,10 @@
The "Natural Capital Project" is defined as the parties of Stanford University,
University of Minnesota, Chinese Academy of Sciences, The Nature Conservancy,
World Wildlife Fund, Stockholm Resilience Centre and the Royal Swedish Academy
of Sciences.
InVEST® is a registered trademark of Stanford University. Stanford University
owns and maintains the trademark, and cooperates with the Natural Capital
Project to develop this policy and any trademark licensing. Visit
naturalcapitalproject.stanford.edu/invest-trademark-and-logo-use-policy for the
complete trademark and logo policy.
Copyright (c) 2023, Natural Capital Project

@@ -0,0 +1,94 @@
# ADR-0001: Update the InVEST SDR LS Factor

Author: James

Science Lead: Rafa

## Context

Since we released the updated InVEST SDR model in InVEST 3.1.0, we have seen a
common refrain from users and NatCap science staff: the LS factor output of
SDR did not produce realistic results, and the LS factor produced by SAGA was
much more realistic. Over the years we have made several notable changes to
the model and to the LS factor that have altered the output, including:

1. The SDR model's underlying routing model was changed from d-infinity to MFD in 3.5.0.
2. The $x$ parameter was changed in InVEST 3.8.1 from the true on-pixel aspect,
   $|\sin \theta|+|\cos \theta|$ (described in Zevenbergen & Thorne 1987 and repeated
   in Desmet & Govers 1996), to the weighted mean of proportional flow from the
   current pixel to its neighbors.
3. A typo in a constant value in the LS factor was corrected in InVEST 3.9.1.
4. An `l_max` parameter was exposed to the user in InVEST 3.9.1.

Despite these changes to the LS factor, we still received occasional reports
describing unrealistic LS factor outputs from SDR and noting that SAGA's LS
factor was much more realistic.

After diving into the SAGA source code, it turns out that there are several
important differences between the two implementations, despite both using
Desmet & Govers (1996) for their LS factor equations:

1. The contributing area $A_{i,j-in}$ is not strictly defined in Desmet &
   Govers (1996); it is only referred to as "the contributing area at the inlet
   of a grid cell with coordinates (i, j) (m^2)".
   InVEST assumes that "contributing area" is $area_{pixel} \cdot n\_upstream\_pixels$.
   SAGA refers to this as "specific catchment area" and allows the user to choose their
   specific catchment area equation, where the available options are
   "contour length simply as cell size", "contour length dependent on aspect", "square
   root of catchment area" and "effective flow length".
2. SAGA uses the on-pixel aspect, $|\sin \theta|+|\cos \theta|$, and does not consider
   flow direction derived from a routing model when calculating the LS factor.
3. The length exponent $m$ differs between the implementations. In SAGA,
   $m = \beta / (1 + \beta)$. In InVEST, we have a discontinuous function where
   $m$ depends on the slope of the current pixel; this is described as "classical USLE"
   in the user's guide and discussed in Oliveira et al (2013).
4. SAGA's flow accumulation function [`Get_Flow()`](https://github.com/saga-gis/saga-gis/blob/master/saga-gis/src/tools/terrain_analysis/ta_hydrology/Erosion_LS_Fields.cpp#L394)
   considers a pixel downstream only if its elevation is strictly less
   than the current pixel's elevation, which implies that flow accumulation will
   not navigate plateaus. InVEST's flow accumulation handles plateaus well,
   which can lead to larger flow accumulation values on the same DEM.
5. SAGA's flow accumulation function `Get_Flow()` uses D8; InVEST's flow
   accumulation uses MFD.

It is important to note that when evaluating differences between the SAGA and InVEST
LS Factor implementations, it is _critical_ to use a hydrologically conditioned DEM,
such as one conditioned by the Wang & Liu method, so that we control for differences
in output due to the presence of plateaus.

Once we finally understood these discrepancies, James implemented several of the
contributing area functions available in SAGA to see what might be most comparable
to the real world. Source code and a docker container for these experiments are
available at
https://github.com/phargogh/invest-ls-factor-vs-saga/blob/main/src/natcap/invest/sdr/sdr.py#L901.
Some additional discussion and notes can be viewed in the related GitHub issue:
https://github.com/natcap/invest/issues/915.

## Decision

After inspecting the results, Rafa decided that we should make these changes to
the LS Factor calculation:

1. We will revert to using the on-pixel aspect, $|\sin \theta|+|\cos \theta|$.
   This is in line with the published literature.
2. We will convert the "contributing area" portion of the LS Factor to be
   $\sqrt{n\_upstream\_pixels \cdot area_{pixel}}$. Rafa's opinion on this
   is that the LS factor equations were designed for a 1-dimensional situation,
   so our specific catchment area number should reflect this.
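
To make these two changes concrete, here is a minimal NumPy sketch of the
updated terms. This is illustrative only; the function name and signature are
ours, not InVEST's actual implementation.

```python
import numpy

def ls_factor_terms(aspect_radians, n_upstream_pixels, pixel_area):
    """Illustrative versions of the two updated LS factor terms."""
    # On-pixel aspect length, |sin(theta)| + |cos(theta)|, per
    # Zevenbergen & Thorne (1987) as used in Desmet & Govers (1996).
    aspect_length = (numpy.abs(numpy.sin(aspect_radians))
                     + numpy.abs(numpy.cos(aspect_radians)))
    # Upstream contributing area converted to an estimate of the
    # specific catchment area by taking the square root.
    specific_catchment_area = numpy.sqrt(n_upstream_pixels * pixel_area)
    return aspect_length, specific_catchment_area
```
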
## Status

## Consequences

Once implemented and released, the LS factor outputs of SDR will be
significantly different, but they should more closely match reality.

We hope that there will be fewer support requests about this once the change is
released.

## References

Zevenbergen & Thorne (1987): https://searchworks.stanford.edu/articles/edb__89861226

Desmet & Govers (1996): https://searchworks.stanford.edu/articles/edsgac__edsgac.A18832564

Oliveira et al (2013): http://dx.doi.org/10.5772/54439

@@ -0,0 +1,60 @@
# ADR-0002: Switch the InVEST License

Author: Doug

Science Lead: N/A

## Context

NatCap and Stanford OTL decided to trademark "InVEST" and register it with
the USPTO. During those discussions Stanford OTL suggested that we revisit
our 3-Clause BSD License and make sure it still met our needs. There was also
concern about whether InVEST was being fairly attributed in derivative works
and whether the InVEST License should play a role in helping define attribution.
While reviewing various licenses, we always intended to stick with a
permissive open source license and not move to a copyleft or more restrictive
license. Doug did a thorough audit of licenses, specifically looking at how
permissive open source licenses and software projects handle attribution.
Doug collaborated with James, Lisa, and others throughout this process.

## Decision

After reviewing different possibilities, Doug, with approval from Lisa and the
Software Team, decided we should switch to the Apache 2.0 License. The Apache
2.0 License provides the following benefits while remaining a permissive open
source license:

1. It explicitly states its terms in clear language, including attribution and
   trademark guidelines, which helps remove ambiguity and room for
   interpretation.
2. It includes a clause requiring derivative works to clearly note any major
   changes to the original source. This felt like a nice addition given the
   scientific nature of our software and the quality our software is known for.
3. It is a widely established and adopted license that is usable as-is.

I will note that after many discussions about whether we could or should
include stricter attribution requirements in the license, the NatCap Stanford
Leadership Team decided the license was not the place to address those issues.

Doug and James did discuss whether we should reach out to prior
contributors to get their sign-off on switching licenses. Doug had noticed this
was something other large open source projects had to contend with when
switching licenses. However, because the license remains permissive and
all major InVEST source code contributors pushed their changes as NatCap
Stanford University employees, we did not feel this was necessary.

## Status

Complete. Released in InVEST 3.14.0.

## Consequences

This should have limited impact given that the Apache 2.0 License is also a
permissive license.

We will now distribute a NOTICE file alongside the LICENSE file that contains
custom copyright language. Derivative works are required to also distribute
this file alongside the Apache 2.0 License.

We hope that derivative works that change InVEST source code will note those
changes in a reasonable way.

@@ -0,0 +1,57 @@
# ADR-0003: Revert Habitat Quality Decay Method

Author: Doug

Science Lead(s): Lisa, Stacie, Jade

## Context

The Habitat Quality model has used convolutions as an implementation for
decaying threat rasters over distance since InVEST 3.3.0. This approach
strayed from the implementation described in the User's Guide, and the User's
Guide was never updated to reflect it. The User's Guide described decay using
a Euclidean distance implementation. When thinking about updating the User's
Guide to reflect the convolution implementation, it was not clear that the
exponential decay via convolutions was providing the desired result. The
justification for the convolution method was to better reflect the real world,
in that the density of a threat, or of surrounding threat pixels, could have
an even greater, cumulative impact and degradation over space. However, the
degradation raster produced from the filtered threat rasters did not make
intuitive sense.

Stacie noted that users on the forum were reporting HQ outputs that were all
at the very high end of the 0-1 range, and this didn't happen prior to the
convolution implementation. The degradation outputs were all very low too. I
believe the reason these values were not reflecting a 0-1 index response for
degradation is that the convolution approach ends up calculating the
impact of each threat ($i_{rxy}$ in the degradation equation) to be a very
small number even if the distance is very small (meaning the pixel is close
to the threat).

We also investigated why the exponential decay equation was using a constant
of 2.99 as a scalar. With the constant of 2.99, the impact of the threat is
reduced by 95% (to 5%) at the specified max threat distance. So we suspect
it's based on the traditional 95% cutoff used in statistics. We could
tweak this cutoff (e.g., 99% decay at max distance) if we wanted.

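As a quick sanity check on that constant (a sketch, assuming the impact decays
as $e^{-2.99 \cdot d / d_{max}}$), the fraction of impact remaining at the max
threat distance is:

```python
import math

# Remaining threat impact at the max threat distance (d == d_max):
print(math.exp(-2.99))  # ~0.0503, i.e. about 5%, a ~95% reduction
```
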
## Decision

After talking things over with the science team (Lisa, Stacie, Jade) we
decided to switch to a simpler Euclidean distance implementation and to
update the User's Guide with an explanation of why the 2.99 constant is used.

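The following is a rough sketch of the decided-upon shape of the computation,
using SciPy's distance transform purely for illustration; InVEST's actual
implementation differs, and the array sizes and variable names here are ours:

```python
import numpy
from scipy import ndimage

# A 100x100 grid with a single threat pixel at its center.
threat = numpy.zeros((100, 100), dtype=bool)
threat[50, 50] = True
max_dist = 30.0  # max threat distance, in pixels

# Euclidean distance from every pixel to the nearest threat pixel.
dist = ndimage.distance_transform_edt(~threat)

# Exponential decay scaled so the impact drops to ~5% at max_dist.
impact = numpy.exp(-2.99 * dist / max_dist)
impact[dist > max_dist] = 0.0
```
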
## Status

Completed and released in InVEST 3.13.0 (2023-03-17).

## Consequences

The degradation and quality outputs will be quite different from previous
versions but should be more intuitive to interpret.

We should see fewer user forum questions regarding this topic.

There should be a noticeable runtime improvement from calculating Euclidean
distances versus convolutions.

## References

GitHub:

* [Pull Request](https://github.com/natcap/invest/pull/1159)
* [Degradation range discussion](https://github.com/natcap/invest/issues/646)
* [Decay function discrepancy](https://github.com/natcap/invest/issues/1104)
* [Users Guide PR](https://github.com/natcap/invest.users-guide/pull/109)

@@ -0,0 +1,12 @@
# Architecture/Any Decision Records

An ADR tracks decisions and their rationale in a form that is tied to
the source code, easy to digest, and written so that future us will
understand it. An ADR consists of several sections:

1. The title and ADR number (for easier sorting)
2. Context about the problem
3. The decision that was made and why
4. The status of implementation
5. Consequences of the implementation
6. Any references (especially if describing a science/software issue)

@@ -21,8 +21,6 @@ pygeoprocessing>=2.4.0 # pip-only
taskgraph[niced_processes]>=0.11.0 # pip-only
psutil>=5.6.6
chardet>=3.0.4
-openpyxl
-xlrd
pint
Babel
Flask

@@ -429,12 +429,12 @@ MODEL_SPEC = {
"veg.tif": {
    "about": "Map of vegetated state.",
    "bands": {1: {"type": "integer"}},
},
-"_taskgraph_working_dir": spec_utils.TASKGRAPH_DIR
}
}
}
}
}
},
+"taskgraph_cache": spec_utils.TASKGRAPH_DIR
}
}

@@ -596,7 +596,6 @@ def execute(args):
    seasonality_constant = float(args['seasonality_constant'])

    # Initialize a TaskGraph
-   work_token_dir = os.path.join(intermediate_dir, '_taskgraph_working_dir')
    try:
        n_workers = int(args['n_workers'])
    except (KeyError, ValueError, TypeError):

@@ -604,7 +603,8 @@ def execute(args):
        # ValueError when n_workers is an empty string.
        # TypeError when n_workers is None.
        n_workers = -1  # single process mode.
-   graph = taskgraph.TaskGraph(work_token_dir, n_workers)
+   graph = taskgraph.TaskGraph(
+       os.path.join(args['workspace_dir'], 'taskgraph_cache'), n_workers)

    base_raster_path_list = [
        args['eto_path'],

@@ -255,10 +255,10 @@ MODEL_SPEC = {
"intermediate": {
    "type": "directory",
    "contents": {
-       **CARBON_OUTPUTS,
-       "_taskgraph_working_dir": spec_utils.TASKGRAPH_DIR
+       **CARBON_OUTPUTS
    }
}
},
+"taskgraph_cache": spec_utils.TASKGRAPH_DIR
}
}

@@ -370,8 +370,6 @@ def execute(args):
    carbon_pool_df = utils.read_csv_to_dataframe(
        args['carbon_pools_path'], MODEL_SPEC['args']['carbon_pools_path'])

-   work_token_dir = os.path.join(
-       intermediate_output_dir, '_taskgraph_working_dir')
    try:
        n_workers = int(args['n_workers'])
    except (KeyError, ValueError, TypeError):

@@ -379,7 +377,8 @@ def execute(args):
        # ValueError when n_workers is an empty string.
        # TypeError when n_workers is None.
        n_workers = -1  # Synchronous mode.
-   graph = taskgraph.TaskGraph(work_token_dir, n_workers)
+   graph = taskgraph.TaskGraph(
+       os.path.join(args['workspace_dir'], 'taskgraph_cache'), n_workers)

    cell_size_set = set()
    raster_size_set = set()

@@ -118,6 +118,9 @@ INVALID_ANALYSIS_YEAR_MSG = gettext(
    "({latest_year})")
INVALID_SNAPSHOT_RASTER_MSG = gettext(
    "Raster for snapshot {snapshot_year} could not be validated.")
+INVALID_TRANSITION_VALUES_MSG = gettext(
+    "The transition table expects values of {model_transitions} but found "
+    "values of {transition_values}.")

POOL_SOIL = 'soil'
POOL_BIOMASS = 'biomass'

@@ -155,7 +158,6 @@ NET_PRESENT_VALUE_RASTER_PATTERN = 'net-present-value-at-{year}{suffix}.tif'
CARBON_STOCK_AT_YEAR_RASTER_PATTERN = 'carbon-stock-at-{year}{suffix}.tif'

INTERMEDIATE_DIR_NAME = 'intermediate'
-TASKGRAPH_CACHE_DIR_NAME = 'task_cache'
OUTPUT_DIR_NAME = 'output'

MODEL_SPEC = {

@@ -521,7 +523,7 @@ MODEL_SPEC = {
}
}
},
-"task_cache": spec_utils.TASKGRAPH_DIR
+"taskgraph_cache": spec_utils.TASKGRAPH_DIR
}
}

@@ -1066,10 +1068,9 @@ def _set_up_workspace(args):
        # TypeError when n_workers is None.
        n_workers = -1  # Synchronous mode.

-   taskgraph_cache_dir = os.path.join(
-       args['workspace_dir'], TASKGRAPH_CACHE_DIR_NAME)
    task_graph = taskgraph.TaskGraph(
-       taskgraph_cache_dir, n_workers, reporting_interval=5.0)
+       os.path.join(args['workspace_dir'], 'taskgraph_cache'),
+       n_workers, reporting_interval=5.0)

    suffix = utils.make_suffix_string(args, 'results_suffix')
    intermediate_dir = os.path.join(

@@ -1077,7 +1078,7 @@ def _set_up_workspace(args):
    output_dir = os.path.join(
        args['workspace_dir'], OUTPUT_DIR_NAME)

-   utils.make_directories([output_dir, intermediate_dir, taskgraph_cache_dir])
+   utils.make_directories([output_dir, intermediate_dir])

    return task_graph, n_workers, intermediate_dir, output_dir, suffix

@@ -2259,4 +2260,26 @@ def validate(args, limit_to=None):
                analysis_year=args['analysis_year'],
                latest_year=max(snapshots.keys()))))

+   # check for invalid options in the transition table
+   if ("landcover_transitions_table" not in invalid_keys and
+           "landcover_transitions_table" in sufficient_keys):
+       transitions_spec = MODEL_SPEC['args']['landcover_transitions_table']
+       transition_options = list(
+           transitions_spec['columns']['[LULC CODE]']['options'].keys())
+       # lowercase options since utils call will lowercase table values
+       transition_options = [x.lower() for x in transition_options]
+       transitions_df = utils.read_csv_to_dataframe(
+           args['landcover_transitions_table'], transitions_spec)
+       transitions_mask = ~transitions_df.isin(transition_options) & ~transitions_df.isna()
+       if transitions_mask.any(axis=None):
+           transition_numpy_mask = transitions_mask.values
+           transition_numpy_values = transitions_df.to_numpy()
+           bad_transition_values = list(
+               numpy.unique(transition_numpy_values[transition_numpy_mask]))
+           validation_warnings.append((
+               ['landcover_transitions_table'],
+               INVALID_TRANSITION_VALUES_MSG.format(
+                   model_transitions=(transition_options),
+                   transition_values=bad_transition_values)))
+
    return validation_warnings

@@ -134,7 +134,7 @@ MODEL_SPEC = {
    "to match all the other LULC maps."),
    "bands": {1: {"type": "integer"}}
},
-"task_cache": spec_utils.TASKGRAPH_DIR
+"taskgraph_cache": spec_utils.TASKGRAPH_DIR
}
}

@@ -167,8 +167,7 @@ def execute(args):
    """
    suffix = utils.make_suffix_string(args, 'results_suffix')
    output_dir = os.path.join(args['workspace_dir'], 'outputs_preprocessor')
-   taskgraph_cache_dir = os.path.join(args['workspace_dir'], 'task_cache')
-   utils.make_directories([output_dir, taskgraph_cache_dir])
+   utils.make_directories([output_dir])

    try:
        n_workers = int(args['n_workers'])

@@ -178,7 +177,8 @@ def execute(args):
        # TypeError when n_workers is None.
        n_workers = -1  # Synchronous mode.
    task_graph = taskgraph.TaskGraph(
-       taskgraph_cache_dir, n_workers, reporting_interval=5.0)
+       os.path.join(args['workspace_dir'], 'taskgraph_cache'),
+       n_workers, reporting_interval=5.0)

    snapshots_dict = utils.read_csv_to_dataframe(
        args['landcover_snapshot_csv'],

@@ -682,10 +682,10 @@ MODEL_SPEC = {
    "fields": WWIII_FIELDS
}
}
},
-"_taskgraph_working_dir": spec_utils.TASKGRAPH_DIR
}
}
}
},
+"taskgraph_cache": spec_utils.TASKGRAPH_DIR
}
}

@@ -793,8 +793,6 @@ def execute(args):
        geomorph_dir, wind_wave_dir, surge_dir, population_dir, slr_dir])
    file_suffix = utils.make_suffix_string(args, 'results_suffix')

-   taskgraph_cache_dir = os.path.join(
-       intermediate_dir, '_taskgraph_working_dir')
    try:
        n_workers = int(args['n_workers'])
    except (KeyError, ValueError, TypeError):

@@ -802,7 +800,8 @@ def execute(args):
        # ValueError when n_workers is an empty string.
        # TypeError when n_workers is None.
        n_workers = -1  # Single process mode.
-   task_graph = taskgraph.TaskGraph(taskgraph_cache_dir, n_workers)
+   task_graph = taskgraph.TaskGraph(
+       os.path.join(args['workspace_dir'], 'taskgraph_cache'), n_workers)

    model_resolution = float(args['model_resolution'])
    max_fetch_distance = float(args['max_fetch_distance'])

@@ -362,10 +362,10 @@ MODEL_SPEC = {
    "bands": {1: {
        "type": "number", "units": u.metric_ton/u.hectare
    }}
},
-"_taskgraph_working_dir": spec_utils.TASKGRAPH_DIR
}
}
}
},
+"taskgraph_cache": spec_utils.TASKGRAPH_DIR
}
}

@@ -510,8 +510,6 @@ def execute(args):
        edge_samples=11)

    # Initialize a TaskGraph
-   work_token_dir = os.path.join(
-       output_dir, _INTERMEDIATE_OUTPUT_DIR, '_taskgraph_working_dir')
    try:
        n_workers = int(args['n_workers'])
    except (KeyError, ValueError, TypeError):

@@ -519,7 +517,8 @@ def execute(args):
        # ValueError when n_workers is an empty string.
        # TypeError when n_workers is None.
        n_workers = -1  # Single process mode.
-   task_graph = taskgraph.TaskGraph(work_token_dir, n_workers)
+   task_graph = taskgraph.TaskGraph(
+       os.path.join(output_dir, 'taskgraph_cache'), n_workers)
    dependent_task_list = []

    crop_lucode = None

@@ -319,10 +319,10 @@ MODEL_SPEC = {
    "bands": {1: {
        "type": "number", "units": u.metric_ton/u.hectare
    }}
},
-"_taskgraph_working_dir": spec_utils.TASKGRAPH_DIR
}
}
}
},
+"taskgraph_cache": spec_utils.TASKGRAPH_DIR
}
}

@@ -483,8 +483,6 @@ def execute(args):
        output_dir, os.path.join(output_dir, _INTERMEDIATE_OUTPUT_DIR)])

    # Initialize a TaskGraph
-   work_token_dir = os.path.join(
-       output_dir, _INTERMEDIATE_OUTPUT_DIR, '_taskgraph_working_dir')
    try:
        n_workers = int(args['n_workers'])
    except (KeyError, ValueError, TypeError):

@@ -492,7 +490,8 @@ def execute(args):
        # ValueError when n_workers is an empty string.
        # TypeError when n_workers is None.
        n_workers = -1  # Single process mode.
-   task_graph = taskgraph.TaskGraph(work_token_dir, n_workers)
+   task_graph = taskgraph.TaskGraph(
+       os.path.join(output_dir, 'taskgraph_cache'), n_workers)
    dependent_task_list = []

    LOGGER.info(

@@ -137,7 +137,7 @@ MODEL_SPEC = {
    "geometries": spec_utils.POINT,
    "fields": {}
},
-"_work_tokens": spec_utils.TASKGRAPH_DIR
+"taskgraph_cache": spec_utils.TASKGRAPH_DIR
}
}

@@ -221,8 +221,6 @@ def execute(args):
    file_registry = utils.build_file_registry(
        [(_OUTPUT_FILES, output_directory)], file_suffix)

-   work_token_dir = os.path.join(output_directory, '_work_tokens')
-
    # Manually setting n_workers to be -1 so that everything happens in the
    # same thread.
    try:

@@ -232,7 +230,8 @@ def execute(args):
        # ValueError when n_workers is an empty string.
        # TypeError when n_workers is None.
        n_workers = -1
-   graph = taskgraph.TaskGraph(work_token_dir, n_workers=n_workers)
+   graph = taskgraph.TaskGraph(
+       os.path.join(output_directory, '_work_tokens'), n_workers=n_workers)

    fill_pits_task = graph.add_task(
        pygeoprocessing.routing.fill_pits,

@@ -251,10 +251,10 @@ MODEL_SPEC = {
    "bands": {1: {
        "type": "number", "units": u.metric_ton/u.hectare
    }}
},
-"_taskgraph_working_dir": spec_utils.TASKGRAPH_DIR
}
}
}
},
+"taskgraph_cache": spec_utils.TASKGRAPH_DIR
}
}

@@ -371,8 +371,6 @@ def execute(args):
    file_suffix = utils.make_suffix_string(args, 'results_suffix')

    # Initialize a TaskGraph
-   taskgraph_working_dir = os.path.join(
-       intermediate_dir, '_taskgraph_working_dir')
    try:
        n_workers = int(args['n_workers'])
    except (KeyError, ValueError, TypeError):

@@ -380,7 +378,8 @@ def execute(args):
        # ValueError when n_workers is an empty string.
        # TypeError when n_workers is None.
        n_workers = -1  # single process mode.
-   task_graph = taskgraph.TaskGraph(taskgraph_working_dir, n_workers)
+   task_graph = taskgraph.TaskGraph(
+       os.path.join(args['workspace_dir'], 'taskgraph_cache'), n_workers)

    # used to keep track of files generated by this module
    output_file_registry = {

@@ -309,10 +309,10 @@ MODEL_SPEC = {
    "bands": {1: {"type": "integer"}}
}
}
},
-"_taskgraph_working_dir": spec_utils.TASKGRAPH_DIR
}
}
}
},
+"taskgraph_cache": spec_utils.TASKGRAPH_DIR
}
}
# All out rasters besides rarity should be gte to 0. Set nodata accordingly.

@@ -377,11 +377,9 @@ def execute(args):
        args['workspace_dir'], 'intermediate')
    utils.make_directories([intermediate_output_dir, output_dir])

-   taskgraph_working_dir = os.path.join(
-       intermediate_output_dir, '_taskgraph_working_dir')
-
    n_workers = int(args.get('n_workers', -1))
-   task_graph = taskgraph.TaskGraph(taskgraph_working_dir, n_workers)
+   task_graph = taskgraph.TaskGraph(
+       os.path.join(args['workspace_dir'], 'taskgraph_cache'), n_workers)

    LOGGER.info("Checking Threat and Sensitivity tables for compliance")
    # Get CSVs as dictionaries and ensure the key is a string for threats.

@ -374,53 +374,60 @@ MODEL_SPEC = {
|
|||
"type": "integer"
|
||||
}}
|
||||
},
|
||||
"cache_dir": {
|
||||
"type": "directory",
|
||||
"contents": {
|
||||
"aligned_dem.tif": {
|
||||
"about": "Copy of the DEM clipped to the extent of the other inputs",
|
||||
"bands": {1: {"type": "number", "units": u.meter}}
|
||||
},
|
||||
"aligned_lulc.tif": {
|
||||
"about": (
|
||||
"Copy of the LULC clipped to the extent of the other inputs "
|
||||
"and reprojected to the DEM projection"),
|
||||
"bands": {1: {"type": "integer"}}
|
||||
},
|
||||
"aligned_runoff_proxy.tif": {
|
||||
"about": (
|
||||
"Copy of the runoff proxy clipped to the extent of the other inputs "
|
||||
"and reprojected to the DEM projection"),
|
||||
"bands": {1: {"type": "number", "units": u.none}}
|
||||
},
|
||||
"filled_dem.tif": spec_utils.FILLED_DEM,
|
||||
"slope.tif": spec_utils.SLOPE,
|
||||
"subsurface_export_n.pickle": {
|
||||
"about": "Pickled zonal statistics of nitrogen subsurface export"
|
||||
},
|
||||
"subsurface_load_n.pickle": {
|
||||
"about": "Pickled zonal statistics of nitrogen subsurface load"
|
||||
},
|
||||
"surface_export_n.pickle": {
|
||||
"about": "Pickled zonal statistics of nitrogen surface export"
|
||||
},
|
||||
"surface_export_p.pickle": {
|
||||
"about": "Pickled zonal statistics of phosphorus surface export"
|
||||
},
|
||||
"surface_load_n.pickle": {
|
||||
"about": "Pickled zonal statistics of nitrogen surface load"
|
||||
},
|
||||
"surface_load_p.pickle": {
|
||||
"about": "Pickled zonal statistics of phosphorus surface load"
|
||||
},
|
||||
"total_export_n.pickle": {
|
||||
"about": "Pickled zonal statistics of total nitrogen export"
|
||||
},
|
||||
"taskgraph.db": {}
|
||||
}
|
||||
"aligned_dem.tif": {
|
||||
"about": "Copy of the DEM clipped to the extent of the other inputs",
|
||||
"bands": {1: {"type": "number", "units": u.meter}}
|
||||
},
|
||||
"aligned_lulc.tif": {
|
||||
"about": (
|
||||
"Copy of the LULC clipped to the extent of the other inputs "
|
||||
"and reprojected to the DEM projection"),
|
||||
"bands": {1: {"type": "integer"}}
|
||||
},
|
||||
"aligned_runoff_proxy.tif": {
|
||||
"about": (
|
||||
"Copy of the runoff proxy clipped to the extent of the other inputs "
|
||||
"and reprojected to the DEM projection"),
|
||||
"bands": {1: {"type": "number", "units": u.none}}
|
||||
},
|
||||
"masked_dem.tif": {
|
||||
"about": "DEM input masked to exclude pixels outside the watershed",
|
||||
"bands": {1: {"type": "number", "units": u.meter}}
|
||||
},
|
||||
"masked_lulc.tif": {
|
||||
"about": "LULC input masked to exclude pixels outside the watershed",
|
||||
"bands": {1: {"type": "integer"}}
|
||||
},
|
||||
"masked_runoff_proxy.tif": {
|
||||
"about": "Runoff proxy input masked to exclude pixels outside the watershed",
|
||||
"bands": {1: {"type": "number", "units": u.none}}
|
||||
},
|
||||
"filled_dem.tif": spec_utils.FILLED_DEM,
|
||||
"slope.tif": spec_utils.SLOPE,
|
||||
"subsurface_export_n.pickle": {
|
||||
"about": "Pickled zonal statistics of nitrogen subsurface export"
|
||||
},
|
||||
"subsurface_load_n.pickle": {
|
||||
"about": "Pickled zonal statistics of nitrogen subsurface load"
|
||||
},
|
||||
"surface_export_n.pickle": {
|
||||
"about": "Pickled zonal statistics of nitrogen surface export"
|
||||
},
|
||||
"surface_export_p.pickle": {
|
||||
"about": "Pickled zonal statistics of phosphorus surface export"
|
||||
},
|
||||
"surface_load_n.pickle": {
|
||||
"about": "Pickled zonal statistics of nitrogen surface load"
|
||||
},
|
||||
"surface_load_p.pickle": {
|
||||
"about": "Pickled zonal statistics of phosphorus surface load"
|
||||
},
|
||||
"total_export_n.pickle": {
|
||||
"about": "Pickled zonal statistics of total nitrogen export"
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
"taskgraph_cache": spec_utils.TASKGRAPH_DIR
|
||||
}
|
||||
}
|
||||
|
||||
|
@ -464,14 +471,14 @@ _INTERMEDIATE_BASE_FILES = {
|
|||
'thresholded_slope_path': 'thresholded_slope.tif',
|
||||
'dist_to_channel_path': 'dist_to_channel.tif',
|
||||
'drainage_mask': 'what_drains_to_stream.tif',
|
||||
}
|
||||
|
||||
_CACHE_BASE_FILES = {
|
||||
'filled_dem_path': 'filled_dem.tif',
|
||||
'aligned_dem_path': 'aligned_dem.tif',
|
||||
'masked_dem_path': 'masked_dem.tif',
|
||||
'slope_path': 'slope.tif',
|
||||
'aligned_lulc_path': 'aligned_lulc.tif',
|
||||
'masked_lulc_path': 'masked_lulc.tif',
|
||||
'aligned_runoff_proxy_path': 'aligned_runoff_proxy.tif',
|
||||
'masked_runoff_proxy_path': 'masked_runoff_proxy.tif',
|
||||
'surface_load_n_pickle_path': 'surface_load_n.pickle',
|
||||
'surface_load_p_pickle_path': 'surface_load_p.pickle',
|
||||
'subsurface_load_n_pickle_path': 'subsurface_load_n.pickle',
|
||||
|
@ -542,8 +549,7 @@ def execute(args):
|
|||
output_dir = os.path.join(args['workspace_dir'])
|
||||
intermediate_output_dir = os.path.join(
|
||||
args['workspace_dir'], INTERMEDIATE_DIR_NAME)
|
||||
cache_dir = os.path.join(intermediate_output_dir, 'cache_dir')
|
||||
utils.make_directories([output_dir, intermediate_output_dir, cache_dir])
|
||||
utils.make_directories([output_dir, intermediate_output_dir])
|
||||
|
||||
try:
|
||||
n_workers = int(args['n_workers'])
|
||||
|
@ -553,13 +559,13 @@ def execute(args):
|
|||
# TypeError when n_workers is None.
|
||||
n_workers = -1 # Synchronous mode.
|
||||
task_graph = taskgraph.TaskGraph(
|
||||
cache_dir, n_workers, reporting_interval=5.0)
|
||||
os.path.join(args['workspace_dir'], 'taskgraph_cache'),
|
||||
n_workers, reporting_interval=5.0)
|
||||
|
||||
file_suffix = utils.make_suffix_string(args, 'results_suffix')
|
||||
f_reg = utils.build_file_registry(
|
||||
[(_OUTPUT_BASE_FILES, output_dir),
|
||||
(_INTERMEDIATE_BASE_FILES, intermediate_output_dir),
|
||||
(_CACHE_BASE_FILES, cache_dir)], file_suffix)
|
||||
(_INTERMEDIATE_BASE_FILES, intermediate_output_dir)], file_suffix)
|
||||
|
||||
# Build up a list of nutrients to process based on what's checked on
|
||||
nutrients_to_process = []
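For reference, every model touched in this change converges on the same TaskGraph setup. A minimal sketch of that shared pattern (the helper name is illustrative, not from the codebase; the TaskGraph(cache_dir, n_workers) signature is as used throughout this diff):

    import os
    import taskgraph

    def make_task_graph(args):
        # KeyError if 'n_workers' is absent, ValueError if it is an empty
        # string, TypeError if it is None -- all fall back to synchronous mode.
        try:
            n_workers = int(args['n_workers'])
        except (KeyError, ValueError, TypeError):
            n_workers = -1  # Synchronous mode.
        # The cache now always lives at <workspace>/taskgraph_cache.
        return taskgraph.TaskGraph(
            os.path.join(args['workspace_dir'], 'taskgraph_cache'), n_workers)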

@ -593,18 +599,64 @@ def execute(args):
base_raster_list, aligned_raster_list,
['near']*len(base_raster_list), dem_info['pixel_size'],
'intersection'),
kwargs={
'base_vector_path_list': [args['watersheds_path']],
'vector_mask_options': {
'mask_vector_path': args['watersheds_path']}},
kwargs={'base_vector_path_list': [args['watersheds_path']]},
target_path_list=aligned_raster_list,
task_name='align rasters')

# Use the cutline feature of gdal.Warp to mask pixels outside the watershed
# it's possible that the DEM, LULC, or runoff proxy inputs might have an
# undefined nodata value. since we're introducing nodata pixels, set a nodata
# value if one is not already defined.
rp_nodata = pygeoprocessing.get_raster_info(
f_reg['aligned_runoff_proxy_path'])['nodata'][0]
mask_runoff_proxy_task = task_graph.add_task(
func=gdal.Warp,
kwargs={
'destNameOrDestDS': f_reg['masked_runoff_proxy_path'],
'srcDSOrSrcDSTab': f_reg['aligned_runoff_proxy_path'],
'dstNodata': _TARGET_NODATA if rp_nodata is None else rp_nodata,
'cutlineDSName': args['watersheds_path']},
dependent_task_list=[align_raster_task],
target_path_list=[f_reg['masked_runoff_proxy_path']],
task_name='mask runoff proxy raster')

dem_nodata = pygeoprocessing.get_raster_info(
f_reg['aligned_dem_path'])['nodata'][0]
dem_target_nodata = float( # GDAL expects a python float, not numpy.float32
numpy.finfo(numpy.float32).min if dem_nodata is None else dem_nodata)
mask_dem_task = task_graph.add_task(
func=gdal.Warp,
kwargs={
'destNameOrDestDS': f_reg['masked_dem_path'],
'srcDSOrSrcDSTab': f_reg['aligned_dem_path'],
'outputType': gdal.GDT_Float32,
'dstNodata': dem_target_nodata,
'cutlineDSName': args['watersheds_path']},
dependent_task_list=[align_raster_task],
target_path_list=[f_reg['masked_dem_path']],
task_name='mask dem raster')

lulc_nodata = pygeoprocessing.get_raster_info(
f_reg['aligned_lulc_path'])['nodata'][0]
lulc_target_nodata = (
numpy.iinfo(numpy.int32).min if lulc_nodata is None else lulc_nodata)
mask_lulc_task = task_graph.add_task(
func=gdal.Warp,
kwargs={
'destNameOrDestDS': f_reg['masked_lulc_path'],
'srcDSOrSrcDSTab': f_reg['aligned_lulc_path'],
'outputType': gdal.GDT_Int32,
'dstNodata': lulc_target_nodata,
'cutlineDSName': args['watersheds_path']},
dependent_task_list=[align_raster_task],
target_path_list=[f_reg['masked_lulc_path']],
task_name='mask lulc raster')

fill_pits_task = task_graph.add_task(
func=pygeoprocessing.routing.fill_pits,
args=(
(f_reg['aligned_dem_path'], 1), f_reg['filled_dem_path']),
kwargs={'working_dir': cache_dir},
(f_reg['masked_dem_path'], 1), f_reg['filled_dem_path']),
kwargs={'working_dir': intermediate_output_dir},
dependent_task_list=[align_raster_task],
target_path_list=[f_reg['filled_dem_path']],
task_name='fill pits')
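The three masking tasks above all wrap the same call; stripped of the taskgraph plumbing, the operation is roughly the following sketch (paths are placeholders, not the real registry keys):

    import numpy
    from osgeo import gdal

    # Warp with a vector cutline: pixels outside the watershed polygons are
    # written as dstNodata, which is why a nodata value must exist even if
    # the source raster never defined one.
    gdal.Warp(
        'masked_dem.tif',                  # destNameOrDestDS
        'aligned_dem.tif',                 # srcDSOrSrcDSTab
        outputType=gdal.GDT_Float32,
        dstNodata=float(numpy.finfo(numpy.float32).min),
        cutlineDSName='watersheds.gpkg')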

@ -613,7 +665,7 @@ def execute(args):
func=pygeoprocessing.routing.flow_dir_mfd,
args=(
(f_reg['filled_dem_path'], 1), f_reg['flow_direction_path']),
kwargs={'working_dir': cache_dir},
kwargs={'working_dir': intermediate_output_dir},
dependent_task_list=[fill_pits_task],
target_path_list=[f_reg['flow_direction_path']],
task_name='flow dir')

@ -654,7 +706,7 @@ def execute(args):

runoff_proxy_index_task = task_graph.add_task(
func=_normalize_raster,
args=((f_reg['aligned_runoff_proxy_path'], 1),
args=((f_reg['masked_runoff_proxy_path'], 1),
f_reg['runoff_proxy_index_path']),
target_path_list=[f_reg['runoff_proxy_index_path']],
dependent_task_list=[align_raster_task],

@ -744,7 +796,7 @@ def execute(args):
load_task = task_graph.add_task(
func=_calculate_load,
args=(
f_reg['aligned_lulc_path'],
f_reg['masked_lulc_path'],
biophysical_df[f'load_{nutrient}'],
load_path),
dependent_task_list=[align_raster_task],

@ -762,7 +814,7 @@ def execute(args):
surface_load_path = f_reg[f'surface_load_{nutrient}_path']
surface_load_task = task_graph.add_task(
func=_map_surface_load,
args=(modified_load_path, f_reg['aligned_lulc_path'],
args=(modified_load_path, f_reg['masked_lulc_path'],
subsurface_proportion_map, surface_load_path),
target_path_list=[surface_load_path],
dependent_task_list=[modified_load_task, align_raster_task],

@ -772,7 +824,7 @@ def execute(args):
eff_task = task_graph.add_task(
func=_map_lulc_to_val_mask_stream,
args=(
f_reg['aligned_lulc_path'], f_reg['stream_path'],
f_reg['masked_lulc_path'], f_reg['stream_path'],
biophysical_df[f'eff_{nutrient}'].to_dict(), eff_path),
target_path_list=[eff_path],
dependent_task_list=[align_raster_task, stream_extraction_task],

@ -782,7 +834,7 @@ def execute(args):
crit_len_task = task_graph.add_task(
func=_map_lulc_to_val_mask_stream,
args=(
f_reg['aligned_lulc_path'], f_reg['stream_path'],
f_reg['masked_lulc_path'], f_reg['stream_path'],
biophysical_df[f'crit_len_{nutrient}'].to_dict(),
crit_len_path),
target_path_list=[crit_len_path],

@ -832,7 +884,7 @@ def execute(args):
biophysical_df['proportion_subsurface_n'].to_dict())
subsurface_load_task = task_graph.add_task(
func=_map_subsurface_load,
args=(modified_load_path, f_reg['aligned_lulc_path'],
args=(modified_load_path, f_reg['masked_lulc_path'],
proportion_subsurface_map, f_reg['sub_load_n_path']),
target_path_list=[f_reg['sub_load_n_path']],
dependent_task_list=[modified_load_task, align_raster_task],

@ -311,10 +311,10 @@ MODEL_SPEC = {
"about": "Farm vector reprojected to the LULC projection",
"fields": {},
"geometries": spec_utils.POLYGONS
},
"_taskgraph_working_dir": spec_utils.TASKGRAPH_DIR
}
}
}
},
"taskgraph_cache": spec_utils.TASKGRAPH_DIR
}
}

@ -504,8 +504,6 @@ def execute(args):
# create initial working directories and determine file suffixes
intermediate_output_dir = os.path.join(
args['workspace_dir'], 'intermediate_outputs')
work_token_dir = os.path.join(
intermediate_output_dir, '_taskgraph_working_dir')
output_dir = os.path.join(args['workspace_dir'])
utils.make_directories(
[output_dir, intermediate_output_dir])

@ -534,7 +532,8 @@ def execute(args):
# ValueError when n_workers is an empty string.
# TypeError when n_workers is None.
n_workers = -1 # Synchronous mode.
task_graph = taskgraph.TaskGraph(work_token_dir, n_workers)
task_graph = taskgraph.TaskGraph(
os.path.join(args['workspace_dir'], 'taskgraph_cache'), n_workers)

if farm_vector_path is not None:
# ensure farm vector is in the same projection as the landcover map

@ -77,7 +77,7 @@ predictor_table_columns = {
"point_nearest_distance": {
"description": gettext(
"Predictor is a point vector. Metric is the Euclidean "
"distance between the center of each AOI grid cell and "
"distance between the centroid of each AOI grid cell and "
"the nearest point in this layer.")},
"line_intersect_length": {
"description": gettext(

@ -331,10 +331,10 @@ MODEL_SPEC = {
},
"server_version.pickle": {
"about": gettext("Server version info")
},
"_taskgraph_working_dir": spec_utils.TASKGRAPH_DIR
}
}
}
},
"taskgraph_cache": spec_utils.TASKGRAPH_DIR
}
}

@ -417,7 +417,7 @@ def execute(args):
* 'point_count': count of the points contained in the
response polygon
* 'point_nearest_distance': distance to the nearest point
from the response polygon
from the centroid of the response polygon
* 'line_intersect_length': length of lines that intersect
with the response polygon in projected units of AOI
* 'polygon_area': area of the polygon contained within

@ -460,7 +460,6 @@ def execute(args):
(_INTERMEDIATE_BASE_FILES, intermediate_dir)], file_suffix)

# Initialize a TaskGraph
taskgraph_db_dir = os.path.join(intermediate_dir, '_taskgraph_working_dir')
try:
n_workers = int(args['n_workers'])
except (KeyError, ValueError, TypeError):

@ -468,7 +467,8 @@ def execute(args):
# ValueError when n_workers is an empty string.
# TypeError when n_workers is None.
n_workers = -1 # single process mode.
task_graph = taskgraph.TaskGraph(taskgraph_db_dir, n_workers)
task_graph = taskgraph.TaskGraph(
os.path.join(output_dir, 'taskgraph_cache'), n_workers)

if args['grid_aoi']:
prep_aoi_task = task_graph.add_task(

@ -1151,7 +1151,7 @@ def _line_intersect_length(
def _point_nearest_distance(
response_polygons_pickle_path, point_vector_path,
predictor_target_path):
"""Calculate distance to nearest point for all polygons.
"""Calculate distance to nearest point for the centroid of all polygons.

Args:
response_polygons_pickle_path (str): path to a pickled dictionary which

@ -1181,7 +1181,7 @@ def _point_nearest_distance(
f"{(100*index)/len(response_polygons_lookup):.2f}% complete"))

point_distance_lookup[str(feature_id)] = min([
geometry.distance(point) for point in points])
geometry.centroid.distance(point) for point in points])
LOGGER.info(f"{os.path.basename(point_vector_path)} point distance: "
"100.00% complete")
with open(predictor_target_path, 'w') as jsonfile:
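The behavioral change is small but user-visible: distances are now measured from each polygon's centroid rather than from the polygon itself. With shapely the difference is a single property access, as this sketch shows:

    from shapely.geometry import Point, box

    cell = box(0, 0, 10, 10)           # an AOI grid cell
    point = Point(15, 5)               # a predictor point

    print(cell.distance(point))           # 5.0, from the nearest edge
    print(cell.centroid.distance(point))  # 10.0, from the cell center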

@ -107,7 +107,7 @@ MODEL_SPEC = {
},
},
"outputs": {
"_taskgraph_working_dir": spec_utils.TASKGRAPH_DIR,
"taskgraph_cache": spec_utils.TASKGRAPH_DIR,
"filled.tif": spec_utils.FILLED_DEM,
"flow_accumulation.tif": spec_utils.FLOW_ACCUMULATION,
"flow_direction.tif": spec_utils.FLOW_DIRECTION,

@ -341,8 +341,7 @@ def execute(args):
``None``
"""
file_suffix = utils.make_suffix_string(args, 'results_suffix')
task_cache_dir = os.path.join(args['workspace_dir'], '_taskgraph_working_dir')
utils.make_directories([args['workspace_dir'], task_cache_dir])
utils.make_directories([args['workspace_dir']])

if ('calculate_flow_direction' in args and
bool(args['calculate_flow_direction'])):

@ -365,7 +364,8 @@ def execute(args):
# TypeError when n_workers is None.
n_workers = -1 # Synchronous mode.

graph = taskgraph.TaskGraph(task_cache_dir, n_workers=n_workers)
graph = taskgraph.TaskGraph(
os.path.join(args['workspace_dir'], 'taskgraph_cache'), n_workers=n_workers)

# Calculate slope. This is intentionally on the original DEM, not
# on the pitfilled DEM. If the user really wants the slope of the filled

@ -177,10 +177,10 @@ MODEL_SPEC = {
"Map of the distance from each pixel to the nearest "
"edge of the focal landcover."),
"bands": {1: {"type": "number", "units": u.pixel}}
},
"_taskgraph_working_dir": spec_utils.TASKGRAPH_DIR
}
}
}
},
"taskgraph_cache": spec_utils.TASKGRAPH_DIR
}
}

@ -253,8 +253,6 @@ def execute(args):
utils.make_directories(
[output_dir, intermediate_output_dir, tmp_dir])

work_token_dir = os.path.join(
intermediate_output_dir, '_taskgraph_working_dir')
try:
n_workers = int(args['n_workers'])
except (KeyError, ValueError, TypeError):

@ -262,7 +260,8 @@ def execute(args):
# ValueError when n_workers is an empty string.
# TypeError when n_workers is None.
n_workers = -1 # Single process mode.
task_graph = taskgraph.TaskGraph(work_token_dir, n_workers)
task_graph = taskgraph.TaskGraph(
os.path.join(args['workspace_dir'], 'taskgraph_cache'), n_workers)

area_to_convert = float(args['area_to_convert'])
replacement_lucode = int(args['replacement_lucode'])

@ -352,57 +352,52 @@ MODEL_SPEC = {
"times the thresholded slope (in eq. (74))"),
"bands": {1: {"type": "ratio"}}
},
"churn_dir_not_for_humans": {
"type": "directory",
"contents": {
"aligned_dem.tif": {
"about": gettext(
"Copy of the input DEM, clipped to the extent "
"of the other raster inputs."),
"bands": {1: {
"type": "number",
"units": u.meter
}}
},
"aligned_drainage.tif": {
"about": gettext(
"Copy of the input drainage map, clipped to "
"the extent of the other raster inputs and "
"aligned to the DEM."),
"bands": {1: {"type": "integer"}},
},
"aligned_erodibility.tif": {
"about": gettext(
"Copy of the input erodibility map, clipped to "
"the extent of the other raster inputs and "
"aligned to the DEM."),
"bands": {1: {
"type": "number",
"units": u.metric_ton*u.hectare*u.hour/(u.hectare*u.megajoule*u.millimeter)
}}
},
"aligned_erosivity.tif": {
"about": gettext(
"Copy of the input erosivity map, clipped to "
"the extent of the other raster inputs and "
"aligned to the DEM."),
"bands": {1: {
"type": "number",
"units": u.megajoule*u.millimeter/(u.hectare*u.hour*u.year)
}}
},
"aligned_lulc.tif": {
"about": gettext(
"Copy of the input drainage map, clipped to "
"the extent of the other raster inputs and "
"aligned to the DEM."),
"bands": {1: {"type": "integer"}},
},
"taskgraph.db": {}
}
"aligned_dem.tif": {
"about": gettext(
"Copy of the input DEM, clipped to the extent "
"of the other raster inputs."),
"bands": {1: {
"type": "number",
"units": u.meter
}}
},
"aligned_drainage.tif": {
"about": gettext(
"Copy of the input drainage map, clipped to "
"the extent of the other raster inputs and "
"aligned to the DEM."),
"bands": {1: {"type": "integer"}},
},
"aligned_erodibility.tif": {
"about": gettext(
"Copy of the input erodibility map, clipped to "
"the extent of the other raster inputs and "
"aligned to the DEM."),
"bands": {1: {
"type": "number",
"units": u.metric_ton*u.hectare*u.hour/(u.hectare*u.megajoule*u.millimeter)
}}
},
"aligned_erosivity.tif": {
"about": gettext(
"Copy of the input erosivity map, clipped to "
"the extent of the other raster inputs and "
"aligned to the DEM."),
"bands": {1: {
"type": "number",
"units": u.megajoule*u.millimeter/(u.hectare*u.hour*u.year)
}}
},
"aligned_lulc.tif": {
"about": gettext(
"Copy of the input drainage map, clipped to "
"the extent of the other raster inputs and "
"aligned to the DEM."),
"bands": {1: {"type": "integer"}},
}
}
}
},
"taskgraph_cache": spec_utils.TASKGRAPH_DIR
}
}

@ -421,6 +416,11 @@ _OUTPUT_BASE_FILES = {
INTERMEDIATE_DIR_NAME = 'intermediate_outputs'

_INTERMEDIATE_BASE_FILES = {
'aligned_dem_path': 'aligned_dem.tif',
'aligned_drainage_path': 'aligned_drainage.tif',
'aligned_erodibility_path': 'aligned_erodibility.tif',
'aligned_erosivity_path': 'aligned_erosivity.tif',
'aligned_lulc_path': 'aligned_lulc.tif',
'cp_factor_path': 'cp.tif',
'd_dn_path': 'd_dn.tif',
'd_up_path': 'd_up.tif',

@ -441,17 +441,9 @@ _INTERMEDIATE_BASE_FILES = {
'w_path': 'w.tif',
'ws_inverse_path': 'ws_inverse.tif',
'e_prime_path': 'e_prime.tif',
'weighted_avg_aspect_path': 'weighted_avg_aspect.tif',
'drainage_mask': 'what_drains_to_stream.tif',
}

_TMP_BASE_FILES = {
'aligned_dem_path': 'aligned_dem.tif',
'aligned_drainage_path': 'aligned_drainage.tif',
'aligned_erodibility_path': 'aligned_erodibility.tif',
'aligned_erosivity_path': 'aligned_erosivity.tif',
'aligned_lulc_path': 'aligned_lulc.tif',
}

# Target nodata is for general rasters that are positive, and _IC_NODATA are
# for rasters that are any range

@ -518,14 +510,11 @@ def execute(args):
intermediate_output_dir = os.path.join(
args['workspace_dir'], INTERMEDIATE_DIR_NAME)
output_dir = os.path.join(args['workspace_dir'])
churn_dir = os.path.join(
intermediate_output_dir, 'churn_dir_not_for_humans')
utils.make_directories([output_dir, intermediate_output_dir, churn_dir])
utils.make_directories([output_dir, intermediate_output_dir])

f_reg = utils.build_file_registry(
[(_OUTPUT_BASE_FILES, output_dir),
(_INTERMEDIATE_BASE_FILES, intermediate_output_dir),
(_TMP_BASE_FILES, churn_dir)], file_suffix)
(_INTERMEDIATE_BASE_FILES, intermediate_output_dir)], file_suffix)

try:
n_workers = int(args['n_workers'])

@ -535,7 +524,8 @@ def execute(args):
# TypeError when n_workers is None.
n_workers = -1 # Synchronous mode.
task_graph = taskgraph.TaskGraph(
churn_dir, n_workers, reporting_interval=5.0)
os.path.join(output_dir, 'taskgraph_cache'),
n_workers, reporting_interval=5.0)

base_list = []
aligned_list = []

@ -606,14 +596,6 @@ def execute(args):
dependent_task_list=[pit_fill_task],
task_name='flow direction calculation')

weighted_avg_aspect_task = task_graph.add_task(
func=sdr_core.calculate_average_aspect,
args=(f_reg['flow_direction_path'],
f_reg['weighted_avg_aspect_path']),
target_path_list=[f_reg['weighted_avg_aspect_path']],
dependent_task_list=[flow_dir_task],
task_name='weighted average of multiple-flow aspects')

flow_accumulation_task = task_graph.add_task(
func=pygeoprocessing.routing.flow_accumulation_mfd,
args=(

@ -628,13 +610,11 @@ def execute(args):
args=(
f_reg['flow_accumulation_path'],
f_reg['slope_path'],
f_reg['weighted_avg_aspect_path'],
float(args['l_max']),
f_reg['ls_path']),
target_path_list=[f_reg['ls_path']],
dependent_task_list=[
flow_accumulation_task, slope_task,
weighted_avg_aspect_task],
flow_accumulation_task, slope_task],
task_name='ls factor calculation')

stream_task = task_graph.add_task(

@ -1020,26 +1000,61 @@ def _calculate_what_drains_to_stream(


def _calculate_ls_factor(
flow_accumulation_path, slope_path, avg_aspect_path, l_max,
target_ls_prime_factor_path):
flow_accumulation_path, slope_path, l_max,
target_ls_factor_path):
"""Calculate LS factor.

Calculates a modified LS factor as Equation 3 from "Extension and
Calculates the LS factor using Equation 3 from "Extension and
validation of a geographic information system-based method for calculating
the Revised Universal Soil Loss Equation length-slope factor for erosion
risk assessments in large watersheds" where the ``x`` term is the average
aspect ratio weighted by proportional flow to account for multiple flow
direction.
risk assessments in large watersheds".

The equation for this is::

             (upstream_area + pixel_area)^(m+1) - upstream_area^(m+1)
    LS = S * ---------------------------------------------------------
             (pixel_area^(m+2)) * aspect_dir * 22.13^(m)

Where

* ``S`` is the slope factor defined in equation 4 from the same paper,
calculated by the following where ``b`` is the slope in radians:

* ``S = 10.8 * sin(b) + 0.03`` where slope < 9%
* ``S = 16.8 * sin(b) - 0.50`` where slope >= 9%

* ``upstream_area`` is interpreted as the square root of the
catchment area, to match SAGA-GIS's method for calculating LS
Factor.
* ``pixel_area`` is the area of the pixel in square meters.
* ``m`` is the slope-length exponent of the RUSLE LS-factor,
which, as discussed in Oliveira et al. 2013 is a function of the
on-pixel slope theta:

* ``m = 0.2`` when ``theta <= 1%``
* ``m = 0.3`` when ``1% < theta <= 3.5%``
* ``m = 0.4`` when ``3.5% < theta <= 5%``
* ``m = 0.5`` when ``5% < theta <= 9%``
* ``m = beta / (1+beta)`` when ``theta > 9%``, where
``beta = (sin(theta) / 0.0896) / (3*sin(theta)^0.8 + 0.56)``

* ``aspect_dir`` is calculated by ``|sin(alpha)| + |cos(alpha)|``
for the given pixel.

Oliveira et al can be found at:

Oliveira, A.H., Silva, M.A. da, Silva, M.L.N., Curi, N., Neto, G.K.,
Freitas, D.A.F. de, 2013. Development of Topographic Factor Modeling
for Application in Soil Erosion Models, in: Intechopen (Ed.), Soil
Processes and Current Trends in Quality Assessment. p. 28.

Args:
flow_accumulation_path (string): path to raster, pixel values are the
contributing upslope area at that cell. Pixel size is square.
slope_path (string): path to slope raster as a percent
avg_aspect_path (string): The path to to raster of the weighted average
of aspects based on proportional flow.
l_max (float): if the calculated value of L exceeds this value
it is clamped to this value.
target_ls_prime_factor_path (string): path to output ls_prime_factor
target_ls_factor_path (string): path to output ls_factor
raster

Returns:
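As a quick numeric check of the equation above (the values are illustrative, not from any test), here is the same arithmetic in plain numpy for a single 30 m pixel with a 6% slope and 101 pixels of flow accumulation:

    import numpy

    cell_size = 30.0
    cell_area = cell_size ** 2
    percent_slope = 6.0
    flow_accumulation = 101.0

    # Square root of the strictly-upstream catchment area (SAGA convention).
    contributing_area = numpy.sqrt((flow_accumulation - 1) * cell_area)
    slope_in_radians = numpy.arctan(percent_slope / 100.0)

    # Equation 4 slope factor; slope < 9%, so the first branch applies.
    s_factor = 10.8 * numpy.sin(slope_in_radians) + 0.03

    m_exp = 0.5  # because 5% < theta <= 9%
    aspect_dir = (numpy.fabs(numpy.sin(slope_in_radians)) +
                  numpy.fabs(numpy.cos(slope_in_radians)))

    l_factor = (
        ((contributing_area + cell_area) ** (m_exp + 1) -
         contributing_area ** (m_exp + 1)) /
        ((cell_size ** (m_exp + 2)) * (aspect_dir ** m_exp) * (22.13 ** m_exp)))

    print(l_factor * s_factor)  # the LS value for this pixel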

@ -1047,8 +1062,6 @@ def _calculate_ls_factor(

"""
slope_nodata = pygeoprocessing.get_raster_info(slope_path)['nodata'][0]
avg_aspect_nodata = pygeoprocessing.get_raster_info(
avg_aspect_path)['nodata'][0]

flow_accumulation_info = pygeoprocessing.get_raster_info(
flow_accumulation_path)

@ -1056,14 +1069,12 @@ def _calculate_ls_factor(
cell_size = abs(flow_accumulation_info['pixel_size'][0])
cell_area = cell_size ** 2

def ls_factor_function(
percent_slope, flow_accumulation, avg_aspect, l_max):
"""Calculate the LS' factor.
def ls_factor_function(percent_slope, flow_accumulation, l_max):
"""Calculate the LS factor.

Args:
percent_slope (numpy.ndarray): slope in percent
flow_accumulation (numpy.ndarray): upslope pixels
avg_aspect (numpy.ndarray): the weighted average aspect from MFD
l_max (float): max L factor, clamp to this value if L exceeds it

Returns:

@ -1073,16 +1084,27 @@ def _calculate_ls_factor(
# avg aspect intermediate output should always have a defined
# nodata value from pygeoprocessing
valid_mask = (
(~utils.array_equals_nodata(avg_aspect, avg_aspect_nodata)) &
~utils.array_equals_nodata(percent_slope, slope_nodata) &
~utils.array_equals_nodata(
flow_accumulation, flow_accumulation_nodata))
result = numpy.empty(valid_mask.shape, dtype=numpy.float32)
result[:] = _TARGET_NODATA

contributing_area = (flow_accumulation[valid_mask]-1) * cell_area
# Although Desmet & Govers (1996) discusses "upstream contributing
# area", this is not strictly defined. We decided to use the square
# root of the upstream contributing area here as an estimate, which
# matches the SAGA LS Factor option "square root of catchment area".
# See the InVEST ADR-0001 for more information.
# We subtract 1 from the flow accumulation because FA includes itself
# in its count of pixels upstream and our LS factor equation wants only
# those pixels that are strictly upstream.
contributing_area = numpy.sqrt(
(flow_accumulation[valid_mask]-1) * cell_area)
slope_in_radians = numpy.arctan(percent_slope[valid_mask] / 100.0)

aspect_length = (numpy.fabs(numpy.sin(slope_in_radians)) +
numpy.fabs(numpy.cos(slope_in_radians)))

# From Equation 4 in "Extension and validation of a geographic
# information system ..."
slope_factor = numpy.where(

@ -1112,7 +1134,7 @@ def _calculate_ls_factor(
l_factor = (
((contributing_area + cell_area)**(m_exp+1) -
contributing_area ** (m_exp+1)) /
((cell_size ** (m_exp + 2)) * (avg_aspect[valid_mask]**m_exp) *
((cell_size ** (m_exp + 2)) * (aspect_length**m_exp) *
(22.13**m_exp)))

# threshold L factor to l_max

@ -1121,12 +1143,10 @@ def _calculate_ls_factor(
result[valid_mask] = l_factor * slope_factor
return result

# call vectorize datasets to calculate the ls_factor
pygeoprocessing.raster_calculator(
[(path, 1) for path in [
slope_path, flow_accumulation_path, avg_aspect_path]] + [
[(path, 1) for path in [slope_path, flow_accumulation_path]] + [
(l_max, 'raw')],
ls_factor_function, target_ls_prime_factor_path, gdal.GDT_Float32,
ls_factor_function, target_ls_factor_path, gdal.GDT_Float32,
_TARGET_NODATA)
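For readers unfamiliar with it, pygeoprocessing.raster_calculator applies a local (per-pixel-stack) operation across a list of aligned inputs; entries tagged 'raw' are passed through to the operation unchanged. A minimal sketch of the call pattern used above (paths and values are placeholders):

    import pygeoprocessing
    from osgeo import gdal

    NODATA = -1.0

    def local_op(array_a, array_b, scalar):
        # array_a and array_b arrive as numpy blocks read from band 1 of
        # each raster; scalar arrives unchanged because it was tagged 'raw'.
        return (array_a + array_b) * scalar

    pygeoprocessing.raster_calculator(
        [('raster_a.tif', 1), ('raster_b.tif', 1), (2.0, 'raw')],
        local_op, 'target.tif', gdal.GDT_Float32, NODATA)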

@ -675,127 +675,3 @@ def calculate_sediment_deposition(

LOGGER.info('Sediment deposition 100% complete')
sediment_deposition_raster.close()


def calculate_average_aspect(
mfd_flow_direction_path, target_average_aspect_path):
"""Calculate the Weighted Average Aspect Ratio from MFD.

Calculates the average aspect ratio weighted by proportional flow
direction.

Args:
mfd_flow_direction_path (string): The path to an MFD flow direction
raster.
target_average_aspect_path (string): The path to where the calculated
weighted average aspect raster should be written.

Returns:
``None``.

"""
LOGGER.info('Calculating average aspect')

cdef float average_aspect_nodata = -1
pygeoprocessing.new_raster_from_base(
mfd_flow_direction_path, target_average_aspect_path,
gdal.GDT_Float32, [average_aspect_nodata], [average_aspect_nodata])

flow_direction_info = pygeoprocessing.get_raster_info(
mfd_flow_direction_path)
cdef int mfd_flow_direction_nodata = flow_direction_info['nodata'][0]
cdef int n_cols, n_rows
n_cols, n_rows = flow_direction_info['raster_size']

cdef _ManagedRaster mfd_flow_direction_raster = _ManagedRaster(
mfd_flow_direction_path, 1, False)

cdef _ManagedRaster average_aspect_raster = _ManagedRaster(
target_average_aspect_path, 1, True)

cdef int seed_row = 0
cdef int seed_col = 0
cdef int n_pixels_visited = 0
cdef int win_xsize, win_ysize, xoff, yoff
cdef int row_index, col_index, neighbor_index
cdef int flow_weight_in_direction
cdef int weight_sum
cdef int seed_flow_value
cdef float aspect_weighted_average, aspect_weighted_sum

# the flow_lengths array is the functional equivalent
# of calculating |sin(alpha)| + |cos(alpha)|.
cdef float* flow_lengths = [
1, <float>SQRT2,
1, <float>SQRT2,
1, <float>SQRT2,
1, <float>SQRT2
]

# Loop over iterblocks to maintain cache locality
# Find each non-nodata pixel and calculate proportional flow
# Multiply proportional flow times the flow length x_d
# write the final value to the raster.
for offset_dict in pygeoprocessing.iterblocks(
(mfd_flow_direction_path, 1), offset_only=True, largest_block=0):
win_xsize = offset_dict['win_xsize']
win_ysize = offset_dict['win_ysize']
xoff = offset_dict['xoff']
yoff = offset_dict['yoff']

LOGGER.info('Average aspect %.2f%% complete', 100 * (
n_pixels_visited / float(n_cols * n_rows)))

for row_index in range(win_ysize):
seed_row = yoff + row_index
for col_index in range(win_xsize):
seed_col = xoff + col_index
seed_flow_value = <int>mfd_flow_direction_raster.get(
seed_col, seed_row)

# Skip this seed if it's nodata (Currently expected to be 0).
# No need to set the nodata value here since we have already
# filled the raster with nodata values at creation time.
if seed_flow_value == mfd_flow_direction_nodata:
continue

weight_sum = 0
aspect_weighted_sum = 0
for neighbor_index in range(8):
neighbor_row = seed_row + ROW_OFFSETS[neighbor_index]
if neighbor_row == -1 or neighbor_row == n_rows:
continue

neighbor_col = seed_col + COL_OFFSETS[neighbor_index]
if neighbor_col == -1 or neighbor_col == n_cols:
continue

flow_weight_in_direction = (seed_flow_value >> (
neighbor_index * 4) & 0xF)
weight_sum += flow_weight_in_direction

aspect_weighted_sum += (
flow_lengths[neighbor_index] *
flow_weight_in_direction)

# Weight sum should never be less than 0.
# Since it's an int, we can compare it directly against the
# value of 0.
if weight_sum == 0:
aspect_weighted_average = average_aspect_nodata
else:
# We already know that weight_sum will be > 0 because we
# check for it in the condition above.
with cython.cdivision(True):
aspect_weighted_average = (
aspect_weighted_sum / <float>weight_sum)

average_aspect_raster.set(
seed_col, seed_row, aspect_weighted_average)

n_pixels_visited += win_xsize * win_ysize

LOGGER.info('Average aspect 100.00% complete')

mfd_flow_direction_raster.close()
average_aspect_raster.close()
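Although calculate_average_aspect is removed, the MFD encoding it decoded is still used throughout SDR: each 32-bit flow value packs eight 4-bit weights, one per neighbor direction. A small Python sketch of that unpacking (the function name and example value are illustrative):

    def mfd_weights(flow_value):
        """Unpack the eight 4-bit neighbor weights from an MFD flow value."""
        return [(flow_value >> (i * 4)) & 0xF for i in range(8)]

    weights = mfd_weights(0x00000F21)           # -> [1, 2, 15, 0, 0, 0, 0, 0]
    total = sum(weights)
    proportions = [w / total for w in weights]  # proportional flow per neighbor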

@ -413,10 +413,62 @@ MODEL_SPEC = {
"bands": {1: {
"type": "integer"
}}
},
'Si.tif': {
"about": gettext("Map of the S_i factor derived from CN"),
"bands": {1: {"type": "number", "units": u.inch}}
},
'lulc_aligned.tif': {
"about": gettext("Copy of LULC input, aligned and clipped "
"to match the other spatial inputs"),
"bands": {1: {"type": "integer"}}
},
'dem_aligned.tif': {
"about": gettext("Copy of DEM input, aligned and clipped "
"to match the other spatial inputs"),
"bands": {1: {"type": "number", "units": u.meter}}
},
'pit_filled_dem.tif': {
"about": gettext("Pit filled DEM"),
"bands": {1: {"type": "number", "units": u.meter}}
},
'soil_group_aligned.tif': {
"about": gettext("Copy of soil groups input, aligned and "
"clipped to match the other spatial inputs"),
"bands": {1: {"type": "integer"}}
},
'flow_accum.tif': spec_utils.FLOW_ACCUMULATION,
'prcp_a[MONTH].tif': {
"bands": {1: {"type": "number", "units": u.millimeter/u.year}},
"about": gettext("Monthly precipitation rasters, aligned and "
"clipped to match the other spatial inputs")
},
'n_events[MONTH].tif': {
"about": gettext("Map of monthly rain events"),
"bands": {1: {"type": "integer"}}
},
'et0_a[MONTH].tif': {
"bands": {1: {"type": "number", "units": u.millimeter}},
"about": gettext("Monthly ET0 rasters, aligned and "
"clipped to match the other spatial inputs")
},
'kc_[MONTH].tif': {
"about": gettext("Map of monthly KC values"),
"bands": {1: {"type": "number", "units": u.none}}
},
'l_aligned.tif': {
"about": gettext("Copy of user-defined local recharge input, "
"aligned and clipped to match the other spatial inputs"),
"bands": {1: {"type": "number", "units": u.millimeter}}
},
'cz_aligned.tif': {
"about": gettext("Copy of user-defined climate zones raster, "
"aligned and clipped to match the other spatial inputs"),
"bands": {1: {"type": "integer"}}
}
}
},
"cache_dir": spec_utils.TASKGRAPH_DIR
"taskgraph_cache": spec_utils.TASKGRAPH_DIR
}
}

@ -441,18 +493,10 @@ _INTERMEDIATE_BASE_FILES = {
'flow_dir_mfd_path': 'flow_dir_mfd.tif',
'qfm_path_list': ['qf_%d.tif' % (x+1) for x in range(N_MONTHS)],
'stream_path': 'stream.tif',
}

_TMP_BASE_FILES = {
'outflow_direction_path': 'outflow_direction.tif',
'outflow_weights_path': 'outflow_weights.tif',
'kc_path': 'kc.tif',
'si_path': 'Si.tif',
'lulc_aligned_path': 'lulc_aligned.tif',
'dem_aligned_path': 'dem_aligned.tif',
'dem_pit_filled_path': 'pit_filled_dem.tif',
'loss_path': 'loss.tif',
'zero_absorption_source_path': 'zero_absorption.tif',
'soil_group_aligned_path': 'soil_group_aligned.tif',
'flow_accum_path': 'flow_accum.tif',
'precip_path_aligned_list': ['prcp_a%d.tif' % x for x in range(N_MONTHS)],

@ -461,7 +505,6 @@ _TMP_BASE_FILES = {
'kc_path_list': ['kc_%d.tif' % x for x in range(N_MONTHS)],
'l_aligned_path': 'l_aligned.tif',
'cz_aligned_raster_path': 'cz_aligned.tif',
'l_sum_pre_clamp': 'l_sum_pre_clamp.tif'
}

@ -593,9 +636,8 @@ def _execute(args):
file_suffix = utils.make_suffix_string(args, 'results_suffix')
intermediate_output_dir = os.path.join(
args['workspace_dir'], 'intermediate_outputs')
cache_dir = os.path.join(args['workspace_dir'], 'cache_dir')
output_dir = args['workspace_dir']
utils.make_directories([intermediate_output_dir, cache_dir, output_dir])
utils.make_directories([intermediate_output_dir, output_dir])

try:
n_workers = int(args['n_workers'])

@ -605,13 +647,13 @@ def _execute(args):
# TypeError when n_workers is None.
n_workers = -1 # Synchronous mode.
task_graph = taskgraph.TaskGraph(
cache_dir, n_workers, reporting_interval=5)
os.path.join(args['workspace_dir'], 'taskgraph_cache'),
n_workers, reporting_interval=5)

LOGGER.info('Building file registry')
file_registry = utils.build_file_registry(
[(_OUTPUT_BASE_FILES, output_dir),
(_INTERMEDIATE_BASE_FILES, intermediate_output_dir),
(_TMP_BASE_FILES, cache_dir)], file_suffix)
(_INTERMEDIATE_BASE_FILES, intermediate_output_dir)], file_suffix)

LOGGER.info('Checking that the AOI is not the output aggregate vector')
if (os.path.normpath(args['aoi_path']) ==

@ -689,7 +731,7 @@ def _execute(args):
args=(
(file_registry['dem_aligned_path'], 1),
file_registry['dem_pit_filled_path']),
kwargs={'working_dir': cache_dir},
kwargs={'working_dir': intermediate_output_dir},
target_path_list=[file_registry['dem_pit_filled_path']],
dependent_task_list=[align_task],
task_name='fill dem pits')

@ -699,7 +741,7 @@ def _execute(args):
args=(
(file_registry['dem_pit_filled_path'], 1),
file_registry['flow_dir_mfd_path']),
kwargs={'working_dir': cache_dir},
kwargs={'working_dir': intermediate_output_dir},
target_path_list=[file_registry['flow_dir_mfd_path']],
dependent_task_list=[fill_pit_task],
task_name='flow dir mfd')
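Both tasks wrap pygeoprocessing's routing functions; the only change here is the working directory, which now points at the intermediate outputs directory rather than a separate cache dir. Stripped of the taskgraph plumbing, the calls look roughly like this sketch (paths are illustrative):

    import pygeoprocessing.routing

    pygeoprocessing.routing.fill_pits(
        ('dem_aligned.tif', 1), 'pit_filled_dem.tif',
        working_dir='intermediate_outputs')

    pygeoprocessing.routing.flow_dir_mfd(
        ('pit_filled_dem.tif', 1), 'flow_dir_mfd.tif',
        working_dir='intermediate_outputs')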

@ -364,10 +364,10 @@ MODEL_SPEC = {
"calculated by convolving the search kernel with the "
"retention ratio raster."),
"bands": {1: {"type": "ratio"}}
},
"cache_dir": spec_utils.TASKGRAPH_DIR
}
}
}
},
"taskgraph_cache": spec_utils.TASKGRAPH_DIR
}
}

@ -439,14 +439,14 @@ def execute(args):
suffix = utils.make_suffix_string(args, 'results_suffix')
output_dir = args['workspace_dir']
intermediate_dir = os.path.join(output_dir, 'intermediate')
cache_dir = os.path.join(intermediate_dir, 'cache_dir')
utils.make_directories(
[args['workspace_dir'], intermediate_dir, cache_dir])
utils.make_directories([args['workspace_dir'], intermediate_dir])
files = utils.build_file_registry(
[(INTERMEDIATE_OUTPUTS, intermediate_dir),
(FINAL_OUTPUTS, output_dir)], suffix)

task_graph = taskgraph.TaskGraph(cache_dir, int(args.get('n_workers', -1)))
task_graph = taskgraph.TaskGraph(
os.path.join(args['workspace_dir'], 'taskgraph_cache'),
int(args.get('n_workers', -1)))

# get the necessary base raster info
source_lulc_raster_info = pygeoprocessing.get_raster_info(

@ -342,10 +342,10 @@ MODEL_SPEC = {
"reference of the LULC."),
"geometries": spec_utils.POLYGONS,
"fields": {}
},
"_taskgraph_working_dir": spec_utils.TASKGRAPH_DIR
}
}
}
},
"taskgraph_cache": spec_utils.TASKGRAPH_DIR
}
}

@ -456,7 +456,7 @@ def execute(args):
n_workers = -1 # Synchronous mode.

task_graph = taskgraph.TaskGraph(
os.path.join(intermediate_dir, '_taskgraph_working_dir'), n_workers)
os.path.join(args['workspace_dir'], 'taskgraph_cache'), n_workers)

# align all the input rasters.
aligned_lulc_raster_path = os.path.join(

@ -186,12 +186,7 @@ MODEL_SPEC = {
"the same spatial reference as the LULC."),
"geometries": spec_utils.POLYGONS,
"fields": {}
}
}
},
"temp_working_dir_not_for_humans": {
"type": "directory",
"contents": {
},
"aligned_lulc.tif": {
"about": "Aligned and clipped copy of the LULC.",
"bands": {1: {"type": "integer"}}

@ -207,10 +202,10 @@ MODEL_SPEC = {
"s_max.tif": {
"about": "Map of potential retention.",
"bands": {1: {"type": "number", "units": u.millimeter}}
},
"taskgraph_data.db": {}
}
}
}
},
"taskgraph_cache": spec_utils.TASKGRAPH_DIR
}
}

@ -257,12 +252,10 @@ def execute(args):
"""
file_suffix = utils.make_suffix_string(args, 'results_suffix')

temporary_working_dir = os.path.join(
args['workspace_dir'], 'temp_working_dir_not_for_humans')
intermediate_dir = os.path.join(
args['workspace_dir'], 'intermediate_files')
utils.make_directories([
args['workspace_dir'], intermediate_dir, temporary_working_dir])
args['workspace_dir'], intermediate_dir])

try:
n_workers = int(args['n_workers'])

@ -271,13 +264,14 @@ def execute(args):
# ValueError when n_workers is an empty string.
# TypeError when n_workers is None.
n_workers = -1 # Synchronous mode.
task_graph = taskgraph.TaskGraph(temporary_working_dir, n_workers)
task_graph = taskgraph.TaskGraph(
os.path.join(args['workspace_dir'], 'taskgraph_cache'), n_workers)

# Align LULC with soils
aligned_lulc_path = os.path.join(
temporary_working_dir, f'aligned_lulc{file_suffix}.tif')
intermediate_dir, f'aligned_lulc{file_suffix}.tif')
aligned_soils_path = os.path.join(
temporary_working_dir,
intermediate_dir,
f'aligned_soils_hydrological_group{file_suffix}.tif')

lulc_raster_info = pygeoprocessing.get_raster_info(

@ -325,7 +319,7 @@ def execute(args):
soil_type_nodata = soil_raster_info['nodata'][0]

cn_raster_path = os.path.join(
temporary_working_dir, f'cn_raster{file_suffix}.tif')
intermediate_dir, f'cn_raster{file_suffix}.tif')
align_raster_stack_task.join()

cn_raster_task = task_graph.add_task(

@ -342,7 +336,7 @@ def execute(args):
# Generate S_max
s_max_nodata = -9999
s_max_raster_path = os.path.join(
temporary_working_dir, f's_max{file_suffix}.tif')
intermediate_dir, f's_max{file_suffix}.tif')
s_max_task = task_graph.add_task(
func=pygeoprocessing.raster_calculator,
args=(

@ -939,5 +933,25 @@ def validate(args, limit_to=None):
be an empty list if validation succeeds.

"""
return validation.validate(args, MODEL_SPEC['args'],
MODEL_SPEC['args_with_spatial_overlap'])
validation_warnings = validation.validate(
args, MODEL_SPEC['args'], MODEL_SPEC['args_with_spatial_overlap'])

sufficient_keys = validation.get_sufficient_keys(args)
invalid_keys = validation.get_invalid_keys(validation_warnings)

if ("curve_number_table_path" not in invalid_keys and
"curve_number_table_path" in sufficient_keys):
# Load CN table. Resulting DF has index and CN_X columns only.
cn_df = utils.read_csv_to_dataframe(
args['curve_number_table_path'],
MODEL_SPEC['args']['curve_number_table_path'])
# Check for NaN values.
nan_mask = cn_df.isna()
if nan_mask.any(axis=None):
nan_lucodes = nan_mask[nan_mask.any(axis=1)].index
lucode_list = list(nan_lucodes.values)
validation_warnings.append((
['curve_number_table_path'],
f'Missing curve numbers for lucode(s) {lucode_list}'))

return validation_warnings
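The NaN check added above is a generic pandas pattern; as a standalone sketch with a toy table (the column names are made up):

    import pandas

    cn_df = pandas.DataFrame(
        {'cn_a': [70, None], 'cn_b': [80, 85]}, index=[1, 2])

    nan_mask = cn_df.isna()
    if nan_mask.any(axis=None):                  # any NaN anywhere in the table?
        nan_lucodes = nan_mask[nan_mask.any(axis=1)].index
        print(list(nan_lucodes.values))          # -> [2]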

@ -46,7 +46,6 @@ MESSAGES = {
'NOT_GDAL_RASTER': gettext('File could not be opened as a GDAL raster'),
'OVR_FILE': gettext('File found to be an overview ".ovr" file.'),
'NOT_GDAL_VECTOR': gettext('File could not be opened as a GDAL vector'),
'NOT_CSV_OR_EXCEL': gettext('File could not be opened as a CSV or Excel file.'),
'NOT_CSV': gettext('File could not be opened as a CSV. File must be encoded as '
'a UTF-8 CSV.'),
'REGEXP_MISMATCH': gettext("Value did not match expected pattern {regexp}"),

@ -56,7 +55,8 @@ MESSAGES = {
'NOT_AN_INTEGER': gettext('Value "{value}" does not represent an integer'),
'NOT_BOOLEAN': gettext("Value must be either True or False, not {value}"),
'NO_PROJECTION': gettext('Spatial file {filepath} has no projection'),
'BBOX_NOT_INTERSECT': gettext("Bounding boxes do not intersect: {bboxes}"),
'BBOX_NOT_INTERSECT': gettext('Not all of the spatial layers overlap each '
'other. All bounding boxes must intersect: {bboxes}'),
'NEED_PERMISSION': gettext('You must have {permission} access to this file'),
}
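Each entry above is a translatable template that callers fill in with str.format; for example, the reworded overlap message renders like so (the bounding boxes are illustrative):

    msg = ('Not all of the spatial layers overlap each '
           'other. All bounding boxes must intersect: {bboxes}')
    print(msg.format(bboxes=[[0, 0, 10, 10], [20, 20, 30, 30]]))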

@ -542,7 +542,7 @@ def check_boolean(value, **kwargs):
return MESSAGES['NOT_BOOLEAN'].format(value=value)


def check_csv(filepath, rows=None, columns=None, excel_ok=False, **kwargs):
def check_csv(filepath, rows=None, columns=None, **kwargs):
"""Validate a table.

Args:

@ -555,8 +555,6 @@ def check_csv(filepath, rows=None, columns=None, excel_ok=False, **kwargs):
exist in the first row of the table. See the docstring of
``check_headers`` for details on validation rules. No more than one
of `rows` and `columns` should be defined.
excel_ok=False (boolean): Whether it's OK for the file to be an Excel
table. This is not a common case.

Returns:
A string error message if an error was found. ``None`` otherwise.

@ -578,13 +576,7 @@ def check_csv(filepath, rows=None, columns=None, excel_ok=False, **kwargs):
filepath, sep=None, engine='python', encoding=encoding,
header=None)
except Exception:
if excel_ok:
try:
dataframe = pandas.read_excel(filepath)
except ValueError:
return MESSAGES['NOT_CSV_OR_EXCEL']
else:
return MESSAGES['NOT_CSV']
return MESSAGES['NOT_CSV']

# assume that at most one of `rows` and `columns` is defined
if columns:
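With the Excel branch gone, the only parsing path left is pandas' delimiter-sniffing CSV read; a sketch of what that call does (the path is a placeholder):

    import pandas

    # sep=None with engine='python' makes pandas sniff the delimiter
    # (comma, semicolon, tab, ...). header=None keeps the first row as
    # data so that header validation can inspect it separately.
    dataframe = pandas.read_csv(
        'table.csv', sep=None, engine='python',
        encoding='utf-8', header=None)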

@ -600,10 +600,10 @@ MODEL_SPEC = {
"LandPts.txt": {
"created_if": "valuation_container",
"about": "This text file logs records of the landing point coordinates."
},
"_taskgraph_working_dir": spec_utils.TASKGRAPH_DIR
}
}
}
},
"taskgraph_cache": spec_utils.TASKGRAPH_DIR
}
}

@ -721,8 +721,6 @@ def execute(args):
utils.make_directories([intermediate_dir, output_dir])

# Initialize a TaskGraph
taskgraph_working_dir = os.path.join(
intermediate_dir, '_taskgraph_working_dir')
try:
n_workers = int(args['n_workers'])
except (KeyError, ValueError, TypeError):

@ -730,7 +728,8 @@ def execute(args):
# ValueError when n_workers is an empty string.
# TypeError when n_workers is None.
n_workers = -1 # single process mode.
task_graph = taskgraph.TaskGraph(taskgraph_working_dir, n_workers)
task_graph = taskgraph.TaskGraph(
os.path.join(args['workspace_dir'], 'taskgraph_cache'), n_workers)

# Append a _ to the suffix if it's not empty and doesn't already have one
file_suffix = utils.make_suffix_string(args, 'results_suffix')

@ -526,10 +526,10 @@ MODEL_SPEC = {
"about": "Wind data",
"geometries": spec_utils.POINT,
"fields": OUTPUT_WIND_DATA_FIELDS
},
"_taskgraph_working_dir": spec_utils.TASKGRAPH_DIR
}
}
}
},
"taskgraph_cache": spec_utils.TASKGRAPH_DIR
}
}

@ -652,7 +652,6 @@ def execute(args):
suffix = utils.make_suffix_string(args, 'results_suffix')

# Initialize a TaskGraph
taskgraph_working_dir = os.path.join(inter_dir, '_taskgraph_working_dir')
try:
n_workers = int(args['n_workers'])
except (KeyError, ValueError, TypeError):

@ -660,7 +659,8 @@ def execute(args):
# ValueError when n_workers is an empty string.
# TypeError when n_workers is None.
n_workers = -1 # single process mode.
task_graph = taskgraph.TaskGraph(taskgraph_working_dir, n_workers)
task_graph = taskgraph.TaskGraph(
os.path.join(args['workspace_dir'], 'taskgraph_cache'), n_workers)

# Resample the bathymetry raster if it does not have square pixel size
try:

@ -2623,10 +2623,12 @@ def _clip_vector_by_vector(
shutil.rmtree(temp_dir, ignore_errors=True)

if empty_clip:
# The "clip_vector_path" is always the AOI.
raise ValueError(
f"Clipping {base_vector_path} by {clip_vector_path} returned 0"
" features. If an AOI was provided this could mean the AOI and"
" Wind Data do not intersect spatially.")
f" features. This means the AOI and {base_vector_path} do not"
" intersect spatially. Please check that the AOI has spatial"
" overlap with all input data.")

LOGGER.info('Finished _clip_vector_by_vector')

@ -877,8 +877,18 @@ class TestCBC2(unittest.TestCase):
# analysis year must be >= the last transition year.
args['analysis_year'] = baseline_year

# Write invalid entries to landcover transition table
with open(args['landcover_transitions_table'], 'w') as transition_table:
transition_table.write('lulc-class,Developed,Forest,Water\n')
transition_table.write('Developed,NCC,,invalid\n')
transition_table.write('Forest,accum,disturb,low-impact-disturb\n')
transition_table.write('Water,disturb,med-impact-disturb,high-impact-disturb\n')
transition_options = [
'accum', 'high-impact-disturb', 'med-impact-disturb',
'low-impact-disturb', 'ncc']

validation_warnings = coastal_blue_carbon.validate(args)
self.assertEqual(len(validation_warnings), 2)
self.assertEqual(len(validation_warnings), 3)
self.assertIn(
coastal_blue_carbon.INVALID_SNAPSHOT_RASTER_MSG.format(
snapshot_year=baseline_year + 10),

@ -887,6 +897,11 @@ class TestCBC2(unittest.TestCase):
coastal_blue_carbon.INVALID_ANALYSIS_YEAR_MSG.format(
analysis_year=2000, latest_year=2010),
validation_warnings[1][1])
self.assertIn(
coastal_blue_carbon.INVALID_TRANSITION_VALUES_MSG.format(
model_transitions=transition_options,
transition_values=['disturb', 'invalid']),
validation_warnings[2][1])

def test_track_first_disturbance(self):
"""CBC: Track disturbances over time."""

@ -1934,7 +1934,7 @@ class HabitatQualityTests(unittest.TestCase):
self.assertTrue(
validate_result,
"expected failed validations instead didn't get any.")
self.assertIn("Bounding boxes do not intersect", validate_result[0][1])
self.assertIn("bounding boxes must intersect", validate_result[0][1])

def test_habitat_quality_argspec_missing_projection(self):
"""Habitat Quality: raise error on missing projection."""

@ -407,11 +407,6 @@ class ValidateModelSpecs(unittest.TestCase):
attrs.discard('rows')
attrs.discard('columns')

# csv type may optionally have an 'excel_ok' attribute
if 'excel_ok' in arg:
self.assertIsInstance(arg['excel_ok'], bool)
attrs.discard('excel_ok')

elif t == 'directory':
# directory type should have a contents property that maps each
# expected path name/pattern within the directory to a nested

@ -170,7 +170,6 @@ class NDRTests(unittest.TestCase):

# use predefined directory so test can clean up files during teardown
args = NDRTests.generate_base_args(self.workspace_dir)
# make args explicit that this is a base run of SWY
args['biophysical_table_path'] = os.path.join(
REGRESSION_DATA, 'input', 'biophysical_table_missing_lucode.csv')
with self.assertRaises(KeyError) as cm:

@ -186,7 +185,6 @@ class NDRTests(unittest.TestCase):

# use predefined directory so test can clean up files during teardown
args = NDRTests.generate_base_args(self.workspace_dir)
# make args explicit that this is a base run of SWY
args['calc_n'] = False
args['calc_p'] = False
validation_messages = ndr.validate(args)

@ -209,8 +207,6 @@ class NDRTests(unittest.TestCase):
os.path.join(self.workspace_dir, 'watershed_results_ndr.gpkg'),
'wb') as f:
f.write(b'')

# make args explicit that this is a base run of SWY
ndr.execute(args)

result_vector = ogr.Open(os.path.join(

@ -247,6 +243,53 @@ class NDRTests(unittest.TestCase):
args['workspace_dir'], 'intermediate_outputs',
'what_drains_to_stream.tif')))

def test_regression_undefined_nodata(self):
"""NDR test when DEM, LULC and runoff proxy have undefined nodata."""
from natcap.invest.ndr import ndr

# use predefined directory so test can clean up files during teardown
args = NDRTests.generate_base_args(self.workspace_dir)

# unset nodata values for DEM, LULC, and runoff proxy
# this is ok because the test data is 100% valid
# regression test for https://github.com/natcap/invest/issues/1005
for key in ['runoff_proxy_path', 'dem_path', 'lulc_path']:
target_path = os.path.join(self.workspace_dir, f'{key}_no_nodata.tif')
source = gdal.OpenEx(args[key], gdal.OF_RASTER)
driver = gdal.GetDriverByName('GTIFF')
target = driver.CreateCopy(target_path, source)
target.GetRasterBand(1).DeleteNoDataValue()
source, target = None, None
args[key] = target_path

ndr.execute(args)

result_vector = ogr.Open(os.path.join(
args['workspace_dir'], 'watershed_results_ndr.gpkg'))
result_layer = result_vector.GetLayer()
result_feature = result_layer.GetFeature(1)
result_layer = None
result_vector = None
mismatch_list = []
# these values were generated by manual inspection of regression
# results
for field, expected_value in [
('p_surface_load', 41.921860),
('p_surface_export', 5.899117),
('n_surface_load', 2978.519775),
('n_surface_export', 289.0498),
('n_subsurface_load', 28.614094),
('n_subsurface_export', 15.61077),
('n_total_export', 304.660614)]:
val = result_feature.GetField(field)
if not numpy.isclose(val, expected_value):
mismatch_list.append(
(field, 'expected: %f' % expected_value,
'actual: %f' % val))
result_feature = None
if mismatch_list:
raise RuntimeError("results not expected: %s" % mismatch_list)
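The setup trick in this test, removing a raster's nodata value, is what forces NDR down the "undefined nodata" path handled by the masking tasks earlier in this diff. A sketch of the same operation on its own (the path is a placeholder; DeleteNoDataValue requires GDAL >= 2.1):

    from osgeo import gdal
    import pygeoprocessing

    raster = gdal.OpenEx('no_nodata.tif', gdal.OF_RASTER | gdal.OF_UPDATE)
    raster.GetRasterBand(1).DeleteNoDataValue()
    raster = None  # dereference so the change flushes to disk

    # Downstream code now sees no nodata value at all:
    assert pygeoprocessing.get_raster_info('no_nodata.tif')['nodata'][0] is None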
|
||||
|
||||
def test_validation(self):
|
||||
"""NDR test argument validation."""
|
||||
from natcap.invest import validation
|
||||
|
|
|
@@ -581,14 +581,14 @@ class TestRecServer(unittest.TestCase):
        expected_grid_vector_path = os.path.join(
            REGRESSION_DATA, 'predictor_data_all_metrics.shp')
        utils._assert_vectors_equal(
-           out_grid_vector_path, expected_grid_vector_path, 1e-3)
+           expected_grid_vector_path, out_grid_vector_path, 1e-3)

        out_scenario_path = os.path.join(
            args['workspace_dir'], 'scenario_results.shp')
        expected_scenario_path = os.path.join(
            REGRESSION_DATA, 'scenario_results_all_metrics.shp')
        utils._assert_vectors_equal(
-           out_scenario_path, expected_scenario_path, 1e-3)
+           expected_scenario_path, out_scenario_path, 1e-3)

    def test_results_suffix_on_serverside_files(self):
        """Recreation test suffix gets added to files created on server."""

@@ -920,7 +920,7 @@ class RecreationRegressionTests(unittest.TestCase):
            REGRESSION_DATA, 'square_grid_vector_path.shp')

        utils._assert_vectors_equal(
-           out_grid_vector_path, expected_grid_vector_path)
+           expected_grid_vector_path, out_grid_vector_path)

    def test_hex_grid(self):
        """Recreation hex grid regression test."""

@@ -937,7 +937,7 @@ class RecreationRegressionTests(unittest.TestCase):
            REGRESSION_DATA, 'hex_grid_vector_path.shp')

        utils._assert_vectors_equal(
-           out_grid_vector_path, expected_grid_vector_path)
+           expected_grid_vector_path, out_grid_vector_path)

    @unittest.skip("skipping to avoid remote server call (issue #3753)")
    def test_no_grid_execute(self):

@@ -1002,7 +1002,7 @@ class RecreationRegressionTests(unittest.TestCase):
            REGRESSION_DATA, 'hex_grid_vector_path.shp')

        utils._assert_vectors_equal(
-           out_grid_vector_path, expected_grid_vector_path)
+           expected_grid_vector_path, out_grid_vector_path)

    def test_existing_regression_coef(self):
        """Recreation test regression coefficients handle existing output."""

@@ -1053,7 +1053,7 @@ class RecreationRegressionTests(unittest.TestCase):
            REGRESSION_DATA, 'test_regression_coefficients.shp')

        utils._assert_vectors_equal(
-           out_coefficient_vector_path, expected_coeff_vector_path, 1e-6)
+           expected_coeff_vector_path, out_coefficient_vector_path, 1e-6)

    def test_predictor_table_absolute_paths(self):
        """Recreation test validation from full path."""

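Note that every call above was flipped so the expected vector is passed first. Assuming the helper's signature is (expected_vector_path, actual_vector_path, field_value_atol) — an inference from these call sites only, not from the helper's body — a hedged sketch of what such a comparison helper might look like:

    import numpy
    from osgeo import ogr

    def _assert_vectors_equal(expected_path, actual_path, field_value_atol=1e-3):
        """Hypothetical sketch: compare two vectors feature-by-feature."""
        expected_vector = ogr.Open(expected_path)
        actual_vector = ogr.Open(actual_path)
        expected_layer = expected_vector.GetLayer()
        actual_layer = actual_vector.GetLayer()
        assert expected_layer.GetFeatureCount() == actual_layer.GetFeatureCount()
        for expected_feat, actual_feat in zip(expected_layer, actual_layer):
            for index in range(expected_feat.GetFieldCount()):
                expected_value = expected_feat.GetField(index)
                actual_value = actual_feat.GetField(index)
                if isinstance(expected_value, float):
                    # float fields compare within an absolute tolerance
                    assert numpy.isclose(
                        expected_value, actual_value, atol=field_value_atol)
                else:
                    assert expected_value == actual_value

Passing the expected vector first keeps any failure report consistent with the usual (expected, actual) assertion convention.
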
@@ -141,11 +141,11 @@ class SDRTests(unittest.TestCase):

        sdr.execute(args)
        expected_results = {
-           'usle_tot': 13.90210914612,
-           'sed_export': 0.55185163021,
-           'sed_dep': 8.80130577087,
-           'avoid_exp': 57971.87890625,
-           'avoid_eros': 1458232.5,
+           'usle_tot': 2.62457418442,
+           'sed_export': 0.09748090804,
+           'sed_dep': 1.71672844887,
+           'avoid_exp': 10199.7490234375,
+           'avoid_eros': 274510.75,
        }

        vector_path = os.path.join(

@@ -213,10 +213,10 @@ class SDRTests(unittest.TestCase):

        sdr.execute(args)
        expected_results = {
-           'sed_export': 0.55185163021,
-           'usle_tot': 13.90210914612,
-           'avoid_exp': 57971.87890625,
-           'avoid_eros': 1458232.5,
+           'sed_export': 0.09748090804,
+           'usle_tot': 2.62457418442,
+           'avoid_exp': 10199.7490234375,
+           'avoid_eros': 274510.75,
        }

        vector_path = os.path.join(

@@ -238,10 +238,10 @@ class SDRTests(unittest.TestCase):
        sdr.execute(args)

        expected_results = {
-           'sed_export': 0.67064666748,
-           'usle_tot': 12.6965303421,
-           'avoid_exp': 69130.8203125,
-           'avoid_eros': 1317588.375,
+           'sed_export': 0.08896198869,
+           'usle_tot': 1.86480903625,
+           'avoid_exp': 9204.283203125,
+           'avoid_eros': 194613.28125,
        }

        vector_path = os.path.join(

@@ -264,10 +264,10 @@ class SDRTests(unittest.TestCase):
        sdr.execute(args)

        expected_results = {
-           'sed_export': 0.97192692757,
-           'usle_tot': 12.68887424469,
-           'avoid_exp': 100960.9609375,
-           'avoid_eros': 1329122.0,
+           'sed_export': 0.17336219549,
+           'usle_tot': 2.56186032295,
+           'avoid_exp': 17980.52734375,
+           'avoid_eros': 267931.71875,
        }

        vector_path = os.path.join(

@@ -391,3 +391,43 @@ class SDRTests(unittest.TestCase):
        what_drains = pygeoprocessing.raster_to_numpy_array(
            target_what_drains_path)
        numpy.testing.assert_allclose(what_drains, expected_drainage)

+    def test_ls_factor(self):
+        """SDR test for our LS Factor function."""
+        from natcap.invest.sdr import sdr
+
+        nodata = -1
+
+        # These varying percent slope values should cover all of the slope
+        # factor and slope table cases.
+        pct_slope_array = numpy.array(
+            [[1.5, 4, 8, 10, 15, nodata]], dtype=numpy.float32)
+        flow_accum_array = numpy.array(
+            [[100, 100, 100, 100, 10000000, nodata]], dtype=numpy.float32)
+        l_max = 25  # affects the last item in the array only
+
+        srs = osr.SpatialReference()
+        srs.ImportFromEPSG(26910)  # NAD83 / UTM zone 10N
+        srs_wkt = srs.ExportToWkt()
+        origin = (463250, 4929700)
+        pixel_size = (30, -30)
+
+        pct_slope_path = os.path.join(self.workspace_dir, 'pct_slope.tif')
+        pygeoprocessing.numpy_array_to_raster(
+            pct_slope_array, nodata, pixel_size, origin, srs_wkt,
+            pct_slope_path)
+
+        flow_accum_path = os.path.join(self.workspace_dir, 'flow_accum.tif')
+        pygeoprocessing.numpy_array_to_raster(
+            flow_accum_array, nodata, pixel_size, origin, srs_wkt,
+            flow_accum_path)
+
+        target_ls_factor_path = os.path.join(self.workspace_dir, 'ls.tif')
+        sdr._calculate_ls_factor(flow_accum_path, pct_slope_path, l_max,
+                                 target_ls_factor_path)
+
+        ls = pygeoprocessing.raster_to_numpy_array(target_ls_factor_path)
+        expected_ls = numpy.array(
+            [[0.253996, 0.657229, 1.345856, 1.776729, 49.802994, -1]],
+            dtype=numpy.float32)
+        numpy.testing.assert_allclose(ls, expected_ls, rtol=1e-6)

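For reference on test_ls_factor: InVEST's SDR LS factor is based on Desmet & Govers (1996). A hedged restatement of the commonly cited slope-length component (not a line-by-line reading of _calculate_ls_factor) is:

    L_{i,j} = \frac{(A_{i,j-in} + D^2)^{m+1} - A_{i,j-in}^{m+1}}
                   {D^{m+2} \cdot x_{i,j}^{m} \cdot (22.13)^{m}}

where A_{i,j-in} is the upslope contributing area at the pixel's inlet (flow accumulation times cell area), D is the cell size, x_{i,j} depends on flow direction, and m is a slope-dependent exponent. The l_max parameter caps the computed slope length, which is why only the very large flow-accumulation entry (10000000) in the test is sensitive to l_max = 25.
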
@@ -353,3 +353,17 @@ class UFRMTests(unittest.TestCase):
            [(['curve_number_table_path'],
              validation.MESSAGES['MATCHED_NO_HEADERS'].format(
                  header='column', header_name='cn_a'))])
+
+        # test missing CN_X values raise warnings
+        args = self._make_args()
+        cn_table = pandas.read_csv(args['curve_number_table_path'])
+        cn_table.at[0, 'CN_A'] = numpy.nan
+        new_cn_path = os.path.join(
+            self.workspace_dir, 'cn_missing_value_table.csv')
+        cn_table.to_csv(new_cn_path, index=False)
+        args['curve_number_table_path'] = new_cn_path
+        result = urban_flood_risk_mitigation.validate(args)
+        self.assertEqual(
+            result,
+            [(['curve_number_table_path'],
+              'Missing curve numbers for lucode(s) [0]')])

@@ -85,7 +85,8 @@ def _build_model_args(workspace):
            6,0,100
            7,1,100
            8,0,100
-           9,1,100"""))
+           9,1,100
+           """))

    admin_geom = [
        shapely.geometry.box(
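The only functional change in that hunk is the trailing newline after the last CSV row. That matters because other tests append rows to this same attribute table (e.g. attr_table.write("10,0.5,100\n") further down in this diff); without a terminating newline, the appended row fuses onto the last existing one. A self-contained illustration (the filename is hypothetical):

    with open('attrs.csv', 'w') as table:
        table.write("lucode,urban_nature,search_radius_m\n9,1,100")  # no newline
    with open('attrs.csv', 'a') as table:
        table.write("10,0.5,100\n")  # appended row
    print(open('attrs.csv').read().splitlines()[-1])
    # prints '9,1,10010,0.5,100' -- two rows fused into one
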
@@ -342,7 +343,7 @@ class UNATests(unittest.TestCase):
        from natcap.invest import urban_nature_access

        nodata = urban_nature_access.FLOAT32_NODATA
-       urban_nature_supply = numpy.array([
+       urban_nature_supply_percapita = numpy.array([
            [nodata, 100.5],
            [75, 100]], dtype=numpy.float32)
        urban_nature_demand = 50

@@ -353,7 +354,7 @@ class UNATests(unittest.TestCase):

        urban_nature_budget = (
            urban_nature_access._urban_nature_balance_percapita_op(
-               urban_nature_supply, urban_nature_demand))
+               urban_nature_supply_percapita, urban_nature_demand))
        expected_urban_nature_budget = numpy.array([
            [nodata, 50.5],
            [25, 50]], dtype=numpy.float32)

@@ -480,6 +481,16 @@ class UNATests(unittest.TestCase):
        admin_vector = None
        admin_layer = None

+        accessible_urban_nature_array = pygeoprocessing.raster_to_numpy_array(
+            os.path.join(args['workspace_dir'], 'output',
+                         'accessible_urban_nature_suffix.tif'))
+        valid_mask = ~utils.array_equals_nodata(
+            accessible_urban_nature_array, urban_nature_access.FLOAT32_NODATA)
+        valid_pixels = accessible_urban_nature_array[valid_mask]
+        self.assertAlmostEqual(numpy.sum(valid_pixels), 6221004.41259766)
+        self.assertAlmostEqual(numpy.min(valid_pixels), 1171.7352294921875)
+        self.assertAlmostEqual(numpy.max(valid_pixels), 11898.0712890625)
+
    def test_split_urban_nature(self):
        from natcap.invest import urban_nature_access

@@ -532,6 +543,23 @@ class UNATests(unittest.TestCase):
        admin_vector = None
        admin_layer = None

+        output_dir = os.path.join(args['workspace_dir'], 'output')
+        self._assert_urban_nature(os.path.join(
+            output_dir, 'accessible_urban_nature_lucode_1_suffix.tif'),
+            72000.0, 0.0, 900.0)
+        self._assert_urban_nature(os.path.join(
+            output_dir, 'accessible_urban_nature_lucode_3_suffix.tif'),
+            1034934.9864730835, 0.0, 4431.1650390625)
+        self._assert_urban_nature(os.path.join(
+            output_dir, 'accessible_urban_nature_lucode_5_suffix.tif'),
+            2837622.9519348145, 0.0, 8136.6884765625)
+        self._assert_urban_nature(os.path.join(
+            output_dir, 'accessible_urban_nature_lucode_7_suffix.tif'),
+            8112734.805541992, 2019.2935791015625, 17729.431640625)
+        self._assert_urban_nature(os.path.join(
+            output_dir, 'accessible_urban_nature_lucode_9_suffix.tif'),
+            7744116.974121094, 1567.57958984375, 12863.4619140625)
+
    def test_split_population(self):
        """UNA: test split population optional module.

@@ -602,6 +630,36 @@ class UNATests(unittest.TestCase):
            rtol=1e-6
        )

+    def _assert_urban_nature(self, path, sum_value, min_value, max_value):
+        """Compare a raster's sum, min and max to given values.
+
+        The raster is assumed to be an accessible urban nature raster.
+
+        Args:
+            path (str): The path to an urban nature raster.
+            sum_value (float): The expected sum of the raster.
+            min_value (float): The expected min of the raster.
+            max_value (float): The expected max of the raster.
+
+        Returns:
+            ``None``
+
+        Raises:
+            AssertionError: When the raster's sum, min or max values are not
+                numerically close to the expected values.
+        """
+        from natcap.invest import urban_nature_access
+
+        accessible_urban_nature_array = (
+            pygeoprocessing.raster_to_numpy_array(path))
+        valid_mask = ~utils.array_equals_nodata(
+            accessible_urban_nature_array,
+            urban_nature_access.FLOAT32_NODATA)
+        valid_pixels = accessible_urban_nature_array[valid_mask]
+        self.assertAlmostEqual(numpy.sum(valid_pixels), sum_value)
+        self.assertAlmostEqual(numpy.min(valid_pixels), min_value)
+        self.assertAlmostEqual(numpy.max(valid_pixels), max_value)
+
    def test_radii_by_pop_group(self):
        """UNA: Test defining radii by population group."""
        from natcap.invest import urban_nature_access

@@ -666,11 +724,19 @@ class UNATests(unittest.TestCase):
            self.assertAlmostEqual(
                expected_value, summary_feature.GetField(fieldname))

+        output_dir = os.path.join(args['workspace_dir'], 'output')
+        self._assert_urban_nature(os.path.join(
+            output_dir, 'accessible_urban_nature_to_pop_male.tif'),
+            6221004.412597656, 1171.7352294921875, 11898.0712890625)
+        self._assert_urban_nature(os.path.join(
+            output_dir, 'accessible_urban_nature_to_pop_female.tif'),
+            6221004.412597656, 1171.7352294921875, 11898.0712890625)
+
    def test_modes_same_radii_same_results(self):
        """UNA: all modes have same results when consistent radii.

        Although the different modes have different ways of defining their
-       search radii, the urban_nature_supply raster should be numerically
+       search radii, the urban_nature_supply_percapita raster should be numerically
        equivalent if they all use the same search radii.

        This is a good gut-check of basic model behavior across modes.

@@ -772,16 +838,19 @@ class UNATests(unittest.TestCase):

        uniform_radius_supply = pygeoprocessing.raster_to_numpy_array(
            os.path.join(uniform_args['workspace_dir'], 'output',
-                        'urban_nature_supply_uniform.tif'))
-       split_urban_nature_supply = pygeoprocessing.raster_to_numpy_array(
-           os.path.join(split_urban_nature_args['workspace_dir'], 'output',
-                        'urban_nature_supply_urban_nature.tif'))
+                        'urban_nature_supply_percapita_uniform.tif'))
+       split_urban_nature_supply_percapita = (
+           pygeoprocessing.raster_to_numpy_array(
+               os.path.join(
+                   split_urban_nature_args['workspace_dir'], 'output',
+                   'urban_nature_supply_percapita_urban_nature.tif')))
        split_pop_groups_supply = pygeoprocessing.raster_to_numpy_array(
            os.path.join(pop_group_args['workspace_dir'], 'output',
-               'urban_nature_supply_popgroup.tif'))
+               'urban_nature_supply_percapita_popgroup.tif'))

        numpy.testing.assert_allclose(
-           uniform_radius_supply, split_urban_nature_supply, rtol=1e-6)
+           uniform_radius_supply, split_urban_nature_supply_percapita,
+           rtol=1e-6)
        numpy.testing.assert_allclose(
            uniform_radius_supply, split_pop_groups_supply, rtol=1e-6)

@@ -872,9 +941,112 @@ class UNATests(unittest.TestCase):
        numpy.testing.assert_allclose(
            numpy.sum(weighted_sum_array[~nodata_pixels]), 1122.5)

+    def test_write_vector(self):
+        """UNA: test writing of various numeric types to the output vector."""
+        from natcap.invest import urban_nature_access
+        args = _build_model_args(self.workspace_dir)
+
+        feature_attrs = {
+            0: {
+                'my-field-1': float(1.2345),
+                'my-field-2': numpy.float32(2.34567),
+                'my-field-3': numpy.float64(3.45678),
+                'my-field-4': int(4),
+                'my-field-5': numpy.int16(5),
+                'my-field-6': numpy.int32(6),
+            },
+        }
+        target_vector_path = os.path.join(self.workspace_dir, 'target.gpkg')
+        urban_nature_access._write_supply_demand_vector(
+            args['admin_boundaries_vector_path'], feature_attrs,
+            target_vector_path)
+
+        self.assertTrue(os.path.exists(target_vector_path))
+        try:
+            vector = gdal.OpenEx(target_vector_path)
+            self.assertEqual(vector.GetLayerCount(), 1)
+            layer = vector.GetLayer()
+            self.assertEqual(len(layer.schema), len(feature_attrs[0]))
+            self.assertEqual(layer.GetFeatureCount(), 1)
+            feature = layer.GetFeature(0)
+            for field_name, expected_field_value in feature_attrs[0].items():
+                self.assertEqual(
+                    feature.GetField(field_name), expected_field_value)
+        finally:
+            feature = None
+            layer = None
+            vector = None
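test_write_vector feeds numpy scalar types straight into the vector-writing helper. If _write_supply_demand_vector needs to coerce them before OGR's SetField sees them (an assumption; its body is not part of this diff), one common approach is to convert numpy scalars to native Python values first:

    import numpy

    def to_native(value):
        """Hypothetical helper: numpy scalars to plain Python values."""
        if isinstance(value, numpy.generic):
            return value.item()  # .item() yields the closest builtin type
        return value

    assert isinstance(to_native(numpy.float32(2.34567)), float)
    assert isinstance(to_native(numpy.int16(5)), int)
    assert to_native(4) == 4  # native values pass through unchanged
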
+    def test_urban_nature_proportion(self):
+        """UNA: Run the model with urban nature proportion."""
+        from natcap.invest import urban_nature_access
+
+        args = _build_model_args(self.workspace_dir)
+        args['search_radius_mode'] = urban_nature_access.RADIUS_OPT_UNIFORM
+        args['search_radius'] = 1000
+        with open(args['lulc_attribute_table'], 'a') as attr_table:
+            attr_table.write("10,0.5,100\n")
+
+        # make sure our inputs validate
+        validation_results = urban_nature_access.validate(args)
+        self.assertEqual(validation_results, [])
+
+        urban_nature_access.execute(args)
+
+    def test_reclassify_urban_nature(self):
+        """UNA: Test for urban nature area reclassification."""
+        from natcap.invest import urban_nature_access
+        args = _build_model_args(self.workspace_dir)
+
+        # Rewrite the lulc attribute table to use proportions of urban nature.
+        with open(args['lulc_attribute_table'], 'w') as attr_table:
+            attr_table.write(textwrap.dedent(
+                """\
+                lucode,urban_nature,search_radius_m
+                0,0,100
+                1,0.1,100
+                2,0,100
+                3,0.3,100
+                4,0,100
+                5,0.5,100
+                6,0,100
+                7,0.7,100
+                8,0,100
+                9,0.9,100
+                """))
+
+        urban_nature_area_path = os.path.join(
+            self.workspace_dir, 'urban_nature_area.tif')
+
+        for limit_to_lucodes in (None, set([1, 3])):
+            urban_nature_access._reclassify_urban_nature_area(
+                args['lulc_raster_path'], args['lulc_attribute_table'],
+                urban_nature_area_path,
+                only_these_urban_nature_codes=limit_to_lucodes)
+
+            # The source lulc is randomized, so need to programmatically build
+            # up the expected array.
+            source_lulc_array = pygeoprocessing.raster_to_numpy_array(
+                args['lulc_raster_path'])
+            pixel_area = abs(_DEFAULT_PIXEL_SIZE[0] * _DEFAULT_PIXEL_SIZE[1])
+            expected_array = numpy.zeros(source_lulc_array.shape,
+                                         dtype=numpy.float32)
+            for i in range(1, 10, 2):
+                if limit_to_lucodes is not None:
+                    if i not in limit_to_lucodes:
+                        continue
+                factor = float(f"0.{i}")
+                expected_array[source_lulc_array == i] = factor * pixel_area
+
+            reclassified_array = pygeoprocessing.raster_to_numpy_array(
+                urban_nature_area_path)
+            numpy.testing.assert_array_almost_equal(
+                reclassified_array, expected_array)
+
    def test_validate(self):
        """UNA: Basic test for validation."""
        from natcap.invest import urban_nature_access
        args = _build_model_args(self.workspace_dir)
-       args['search_radius_mode'] = urban_nature_access.RADIUS_OPT_URBAN_NATURE
+       args['search_radius_mode'] = (
+           urban_nature_access.RADIUS_OPT_URBAN_NATURE)
        self.assertEqual(urban_nature_access.validate(args), [])

@@ -731,24 +731,6 @@ class CSVValidation(unittest.TestCase):
            header='column', header_name='field_a')
        self.assertEqual(error_msg, expected_msg)

-   def test_excel_missing_fieldnames(self):
-       """Validation: test that we can check missing fieldnames in excel."""
-       from natcap.invest import validation
-
-       df = pandas.DataFrame([
-           {'foo': 1, 'bar': 2, 'baz': 3},
-           {'foo': 2, 'bar': 3, 'baz': 4},
-           {'foo': 3, 'bar': 4, 'baz': 5}])
-
-       target_file = os.path.join(self.workspace_dir, 'test.xlsx')
-       df.to_excel(target_file)
-
-       error_msg = validation.check_csv(
-           target_file, columns={'field_a': {}}, excel_ok=True)
-       expected_msg = validation.MESSAGES['MATCHED_NO_HEADERS'].format(
-           header='column', header_name='field_a')
-       self.assertEqual(error_msg, expected_msg)
-
    def test_wrong_filetype(self):
        """Validation: verify CSV type does not open pickles."""
        from natcap.invest import validation

@@ -761,12 +743,7 @@ class CSVValidation(unittest.TestCase):
        target_file = os.path.join(self.workspace_dir, 'test.pckl')
        df.to_pickle(target_file)

-       error_msg = validation.check_csv(
-           target_file, columns={'field_a': {}}, excel_ok=True)
-       self.assertEqual(error_msg, validation.MESSAGES['NOT_CSV_OR_EXCEL'])
-
-       error_msg = validation.check_csv(
-           target_file, columns={'field_a': {}}, excel_ok=False)
+       error_msg = validation.check_csv(target_file, columns={'field_a': {}})
        self.assertEqual(error_msg, validation.MESSAGES['NOT_CSV'])

    def test_slow_to_open(self):

@@ -847,7 +847,7 @@ class WindEnergyRegressionTests(unittest.TestCase):
            wind_energy.execute(args)

        self.assertTrue(
-           "returned 0 features. If an AOI was" in str(cm.exception))
+           "returned 0 features. This means the AOI " in str(cm.exception))


class WindEnergyValidationTests(unittest.TestCase):

@@ -30,6 +30,16 @@ const config = {
    main: 'build/main/main.js',
    version: installerVersion,
  },
+ extraFiles: [
+   {
+     from: '../LICENSE.txt',
+     to: 'LICENSE.InVEST.txt',
+   },
+   {
+     from: '../NOTICE.txt',
+     to: 'NOTICE.InVEST.txt',
+   },
+ ],
  extraResources: [
    {
      from: '../dist/invest',

@@ -43,10 +53,6 @@ const config = {
      from: 'resources/storage_token.txt',
      to: 'storage_token.txt',
    },
-   {
-     from: '../LICENSE.txt',
-     to: 'LICENSE.InVEST.txt',
-   },
  ],
  appId: APP_ID,
  productName: PRODUCT_NAME,

[binary image assets updated — before/after sizes: 44 KiB → 18 KiB; 70 KiB → 46 KiB; 25 KiB → 25 KiB]

@@ -75,7 +75,7 @@ export const createWindow = async () => {

  splashScreen = new BrowserWindow({
    width: 574, // dims set to match the image in splash.html
-   height: 500,
+   height: 479,
    transparent: true,
    frame: false,
    alwaysOnTop: false,

@@ -14,6 +14,7 @@ import Spinner from 'react-bootstrap/Spinner';
import Tooltip from 'react-bootstrap/Tooltip';
import OverlayTrigger from 'react-bootstrap/OverlayTrigger';
import { MdClose, MdHome } from 'react-icons/md';
+import { AiOutlineTrademarkCircle } from 'react-icons/ai';

import HomeTab from './components/HomeTab';
import InvestTab from './components/InvestTab';

@@ -293,6 +294,7 @@ export default class App extends React.Component {
                InVEST
              </Nav.Link>
            </Navbar.Brand>
+           <AiOutlineTrademarkCircle className="rtm" />
          </Col>
          <Col className="navbar-middle">
            <Nav

@@ -30,7 +30,8 @@ const { ipcRenderer } = window.Workbench.electron;
 * @returns {string} - the filtered and formatted part of the message
 */
function filterSpatialOverlapFeedback(message, filepath) {
- const newPrefix = i18n.t('Bounding box does not intersect at least one other:');
+ const newPrefix = i18n.t(
+   'Not all of the spatial layers overlap each other. Bounding box:');
  const bbox = message.split(`${filepath}:`).pop().split('|')[0];
  const bboxFormatted = bbox.split(' ').map(
    (str) => str.padEnd(22, ' ')

@@ -167,7 +168,7 @@ export default function ArgInput(props) {
  // Messages with this pattern include validation feedback about
  // multiple inputs, but the whole message is repeated for each input.
  // It's more readable if filtered on the individual input.
- const pattern = 'Bounding boxes do not intersect';
+ const pattern = 'Not all of the spatial layers overlap each other';
  if (validationMessage.startsWith(pattern)) {
    validationMessage = filterSpatialOverlapFeedback(
      validationMessage, value
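The string surgery in filterSpatialOverlapFeedback isolates the bounding box that follows the current input's filepath, since the full validation message repeats every layer's box. The same extraction, sketched in Python for clarity (the message text below is a shortened stand-in, not the real InVEST output):

    message = (
        'Not all of the spatial layers overlap each other. '
        'All bounding boxes must intersect: '
        './vector.shp: [-87.23, 26.48, -87.21, 26.49] | '
        './raster.tif: [-79.01, 26.48, -78.37, 27.26]')
    filepath = './raster.tif'
    # everything after "<filepath>:" and before the next "|" is this layer's box
    bbox = message.split(f'{filepath}:').pop().split('|')[0].strip()
    print(bbox)  # [-79.01, 26.48, -78.37, 27.26]
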
@@ -24,7 +24,7 @@ ReactDom.render(
      <img
        src={investLogo}
        width="191"
-       height="167"
+       height="159"
        alt="InVEST logo"
      />
      <div id="invest-version">

@@ -32,43 +32,65 @@ ReactDom.render(
        <span id="version-string" />
      </div>
      <p id="invest-copyright">
-       {t('Copyright 2022, The Natural Capital Project')}
+       {t('Copyright 2023, The Natural Capital Project')}
      </p>
    </div>
    <br />
    <div id="links">
-     <p>
-       {t('Documentation: ')}
-       <a
-         href="http://releases.naturalcapitalproject.org/invest-userguide/latest/"
-       >
-         http://releases.naturalcapitalproject.org/invest-userguide/latest/
-       </a>
-     </p>
-     <p>
-       {t('Homepage: ')}
-       <a
-         href="https://naturalcapitalproject.stanford.edu/"
-       >
-         https://naturalcapitalproject.stanford.edu/
-       </a>
-     </p>
-     <p>
-       {t('Project page: ')}
-       <a
-         href="https://github.com/natcap/invest"
-       >
-         https://github.com/natcap/invest
-       </a>
-     </p>
-     <p>
-       {t('License: ')}
-       <a
-         href="https://github.com/natcap/invest/blob/master/LICENSE.txt"
-       >
-         BSD 3-clause
-       </a>
-     </p>
+     <table>
+       <tbody>
+         <tr>
+           <td>{t('Documentation')}</td>
+           <td>
+             <a
+               href="http://releases.naturalcapitalproject.org/invest-userguide/latest/"
+             >
+               http://releases.naturalcapitalproject.org/invest-userguide/latest/
+             </a>
+           </td>
+         </tr>
+         <tr>
+           <td>{t('Homepage')}</td>
+           <td>
+             <a
+               href="https://naturalcapitalproject.stanford.edu/"
+             >
+               https://naturalcapitalproject.stanford.edu/
+             </a>
+           </td>
+         </tr>
+         <tr>
+           <td>{t('Project page')}</td>
+           <td>
+             <a
+               href="https://github.com/natcap/invest"
+             >
+               https://github.com/natcap/invest
+             </a>
+           </td>
+         </tr>
+         <tr>
+           <td>{t('License')}</td>
+           <td>
+             <a
+               href="https://github.com/natcap/invest/blob/main/LICENSE.txt"
+             >
+               Apache 2.0
+             </a>
+           </td>
+         </tr>
+         <tr>
+           <td>{t('InVEST Trademark and Logo Use Policy')}</td>
+           <td>
+             <a
+               href="https://naturalcapitalproject.stanford.edu/invest-trademark-and-logo-use-policy"
+             >
+               Trademark and Logo Policy
+             </a>
+           </td>
+         </tr>
+       </tbody>
+     </table>
    </div>
    <div id="licenses">
      <h4>{t('Open-Source Licenses:')}</h4>

@@ -55,13 +55,13 @@ ReactDom.render(
      <img
        src={investLogo}
        width="143"
-       height="125"
+       height="119"
        alt="invest logo"
      />
      <img
        src={natcapLogo}
        width="143"
-       height="125"
+       height="119"
        alt="Natural Capital Project logo"
      />
    </div>

[binary image asset updated — before/after size: 69 KiB → 48 KiB]

@@ -1,5 +1,5 @@
:root {
  font-size: 16px;
}

@@ -12,97 +12,106 @@
   to appear with height: 100%, even when parent has a fixed height;
*/
body {
  --invest-green: #148F68;
  --header-height: 64px;
  --content-height: calc(100vh - var(--header-height));
  min-height: 100vh;
  background-color: rgba(0,0,0,0);
  margin: 0;
  overflow-y: hidden;
}

/*Top Navigation*/

.navbar {
  border-bottom: 3px solid var(--invest-green);
  padding: 0;
  height: var(--header-height)
}

.navbar .row {
  align-items: end;
}

.navbar-brand {
  padding-bottom: 0;
+ margin-right: 0;
}

.navbar-brand:hover {
  text-decoration-line: underline;
  text-decoration-thickness: 3px;
}

.navbar-middle {
  flex-shrink: 1;
  min-width: 0;
  padding-left: 0;
  padding-right: 0;
}

.navbar-right {
  max-width: fit-content;
  margin-left: 0.5rem;
  margin-right: -15px;
  padding-left: 0;
  padding-right: 0;
}

.navbar-brand .nav-link {
  color: var(--invest-green);
  font-weight: 600;
  font-size: 2.0rem;
  letter-spacing: 1px;
  padding-bottom: 0;
  padding-left: 0.7rem;
+ padding-right: 0;
  display: flex;
  align-items: center;
}

/*tricky to align icon with text*/
.navbar-brand .nav-link svg {
  margin-bottom: -0.2rem;
}

+/*registered trademark icon styling*/
+.rtm {
+  color: var(--invest-green);
+  font-size: 0.5rem;
+  margin-bottom: 0.3rem;
+}

.navbar-nav.nav-tabs {
  border-bottom: 0;
  overflow-x: hidden;
  overflow-y: hidden;
}

.navbar-nav .nav-item {
  display: flex;
  justify-content: space-between;
  align-items: center;
  max-width: fit-content;
  white-space: nowrap;
  flex: 1; /*size by available space, not by content*/
  min-width: 0; /*and allow it to shrink to fit container*/
  background-color: transparent;
  border: 2px solid #999999;
+ border-bottom: 0;
+ border-radius: 5px 5px 0 0;
}

.navbar-nav .nav-item.active {
  border: 3px solid var(--invest-green);
  border-bottom: 0;
  background-color: ghostwhite;
}

.navbar-light .navbar-nav .nav-link {
  overflow: hidden;
  text-overflow: ellipsis;
  min-width: 0;
  background-color: transparent;
  color: #999999;
  border: 0;

@@ -111,294 +120,294 @@ body {
}

.close-tab.btn {
  background-color: transparent;
  border-color: transparent;
  color: #6c757d;
  margin-left: 0.1rem;
  padding-left: 0.1rem;
  padding-right: 0.1rem;
  padding-top: 0.1rem;
  padding-bottom: 0.1rem;
  height: fit-content;
}
.close-tab:hover, .close-tab:focus, .close-tab:active {
  border-color: #6c757d;
  background-color: #6c757d;
  color: white;
}

.progress {
  display: inline-flex;
  height: 2rem;
  font-size: 1rem;
  background-color: #aeb1b3;
}

.progress-bar {
  padding-left: 0.5rem;
  padding-right: 0.5rem;
}

.navbar .alert {
  padding: 0.4rem;
}

.navbar .row .text-right {
  white-space: nowrap;
}

.settings-icon-btn, .settings-icon-btn:hover {
  background-color: transparent;
  border-color: transparent;
}

.settings-icon {
  color: black;
  font-size: 2rem;
  vertical-align: text-bottom;
}

.language-icon {
  color: black;
  font-size: 1rem;
  vertical-align: text-bottom;
  margin-right: 0.2rem;
}

/* add padding to accomodate the width of a scroll bar
otherwise the bar will make overall width of content
exceed 100% of window.*/
#home-tab-content {
  padding-right: 16px;
}

/* Home Tab */

.invest-list-container {
  height: var(--content-height);
  overflow-y: auto;
}

.invest-list-group {
  /*do not constrain height here because it can cause more
  columns to be created. Constrain it in parent instead.*/
  display: block;
  column-count: 2;
  white-space: nowrap;
}

.invest-list-group .list-group-item {
  border-top: none;
  border-right: none;
  border-left: none;
  border-bottom: 1px solid rgba(0,0,0,.125);
  margin-bottom: 0;
  overflow: hidden;
  text-overflow: ellipsis;
}

.invest-list-group .list-group-item:last-child {
  border-radius: 0;
}

.invest-button {
  color: var(--invest-green);
  font-weight: 600;
}

.invest-button:hover {
  color: white;
  background-color: var(--invest-green);
  opacity: 75%;
  font-weight: 600;
}

.open-button-col {
  max-width: fit-content;
  padding-right: 0;
}

.recent-header-col {
  padding-left: 0;
}

.recent-header-col .default-text {
  padding-left: 0;
  text-align: right;
}

.recent-job-card-col {
  height: var(--content-height);
  overflow-y: auto;
  padding-right: 0.5rem;
}

.recent-job-card-col .container {
  padding-top: 0.25rem;
  padding-bottom: 0.25rem;
  padding-right: 0.5rem;
}

.recent-job-card-col .container .row {
  align-items: center;
}

.recent-job-card {
  width: inherit;
  margin-bottom: 1rem;
  padding: 0;
  height: fit-content;
  filter: drop-shadow(2px 2px 2px grey);
}

.card-body {
  padding: 0;
  width: inherit;
}

.card-header {
  background-color: var(--invest-green);
  filter: opacity(0.75);
  margin-bottom: 1rem;
  margin-left: -0.1rem;
  margin-right: -0.05rem;
}

.card-header .header-title {
  color: white;
  font-size: 1.35rem;
  font-weight: 600;
  letter-spacing: 0.5px;
}

.card-title {
  padding-left: 1.25rem;
}

.card-title .text-heading {
  color: gray;
}

.card-title .text-mono {
  font-family: monospace;
  font-size: 1.2rem;
}

.card-footer {
  display: flex;
  justify-content: space-between;
  white-space: nowrap;
  padding-right: 1rem;
}

.card-footer span {
  text-overflow: ellipsis;
}

.card-footer .status {
  overflow: hidden;
  font-style: italic;
  text-transform: capitalize;
}

.card-footer .status-error {
  color: red;
}

.card-footer .status-success {
  color: var(--invest-green);
}

.card-footer .timestamp {
  padding-right: 2rem;
}

/* InVEST Model Tab */

.invest-main-col {
  /* main col can grow & shrink, sidebar cannot. */
  flex: 1 1 0;
  padding-left: 0;
  padding-right: 0;
}

.invest-main-col .tab-content {
  background-color: ghostwhite;
}

.invest-sidebar-col {
  /* sidebar cannot grow or shrink, main col can.
   * sidebar width tries to be 25%.
   */
  flex: 0 0 25%;
  display: flex;
  flex-direction: column;
  padding-right: 0;
  padding-bottom: 1rem;
}

.invest-sidebar-col .nav-link {
  color: #000;
  border-right: 5px solid transparent;
  border-radius: 0;
  font-size: 1.3rem;
  font-weight: 550;
  display: flex;
  justify-content: space-between;
  align-items: center;
  padding-right: 0;
}

.invest-sidebar-col .nav-link svg {
  font-size: 2rem;
}

.invest-sidebar-col .nav-link:hover, .invest-sidebar-col .nav-link:focus {
  background-color: rgb(240, 240, 240);
  border-right: 5px solid rgb(240, 240, 240);
  border-radius: 0;
}

.invest-sidebar-col .nav-link.disabled {
  color: #888888;
}

.invest-sidebar-col .nav-link.active {
  color: #000;
  background-color: ghostwhite;
  border-right: 5px solid var(--invest-green);
  border-radius: 0;
}

.sidebar-row {
  display: flex;
  flex-direction: column;
  font-size: 1rem;
  padding-left: 1rem;
  padding-right: 1rem;
}

.sidebar-buttons {
  display: flex;
  flex-direction: column;
  flex-wrap: nowrap;
  align-items: flex-start;
}

.sidebar-buttons .btn {
  white-space: nowrap;
  background: none;
  border: none;
  color: black;
  padding-left: 0;
}

.sidebar-buttons .alert {
  margin-top: 1rem;
}

.sidebar-links a {
  padding-top: 0.5rem;
}

.sidebar-footer {

@@ -407,42 +416,42 @@ exceed 100% of window.*/

/* Model Status Alert */
.invest-sidebar-col .sidebar-footer .alert {
  display: flex;
  flex-direction: column;
  justify-content: center;
  order: -1;
  font-family: monospace;
  font-weight: bold;
}

.invest-sidebar-col .alert .btn {
  flex: 1;
  margin-top: 1rem;
  font-size: 1.1rem;
  font-family: revert;
  letter-spacing: 0.03rem;
}

/* The Run button */
.sidebar-footer .btn-primary {
  font-size: 1.5rem;
  font-weight: 600;
  letter-spacing: 0.1rem;
}

.sidebar-footer .btn-primary .spinner-border {
  margin-left: 1rem;
  margin-bottom: 0.25rem;
}

/* InVEST Setup Tab */
.args-form {
  height: var(--content-height);
  width: 100%;
  overflow-y: auto;
  overflow-x: hidden;
  padding-left: 2rem;
  padding-right: 1rem;
}

/* InVEST Argument Forms */

@@ -451,46 +460,46 @@
}

.arg-group {
  margin-top: 1.0rem;
  margin-bottom: 1.5rem;
  padding-bottom: 0.5rem;
  border-bottom: 1px dotted;
}

.arg-group:last-of-type {
  border-bottom: none;
}

.arg-hide {
  display: none;
}

.arg-disable .form-label {
  color: #a5a5a5;
}

.args-form .form-control:disabled {
  color: #888888;
}

.args-form .form-group {
  align-items: center;
}

#argname {
  text-transform: capitalize;
  font-weight: 600;
}

.args-form .form-control {
  font-family: monospace;
  font-size:1.3rem;
}

.args-form .form-control[type=text] {
  /*always hold space for a validation mark so the rightmost
  text is never hidden by the mark when it appears.*/
  padding-right: 2em;
}

input.input-dragging {

@@ -498,93 +507,93 @@ input.input-dragging {
}

input[type=text]::placeholder {
  font-style: italic;
  color: #a5a5a5;
}

/* Default sized toggle switch too small */
.form-switch {
  transform:scale(1.5);
  margin-left: 1rem;
  margin-top: 0.6rem;
}

.invalid-feedback {
  font-size: 0.9rem;
  font-family: monospace;
  white-space: pre-wrap;
  padding-left: 3rem;
  text-indent: -3rem;
}

.args-form svg {
  font-size: 1.5rem;
}

/* InVEST Log Tab */
#log-display {
  overflow: auto;
  white-space: pre-wrap;
  height: var(--content-height);
}

#log-display {
  font-family: monospace;
  font-size:1rem;
  color: #000;
  padding-right: 0;
}

#log-display .invest-log-primary {
  color: #000;
  font-weight: bold;
}

#log-display .invest-log-primary-warning {
  color: #f13232;
}

#log-display .invest-log-error {
  color: #f13232;
  font-weight: bold;
}

/* Download Data Modal */
.download-data-modal .modal-dialog {
  overflow-y: initial;
}

.download-data-modal .modal-body {
  height: 80vh;
  overflow-y: auto;
}

/* Settings Modal */
.settings-modal .modal-dialog {
  /*Hardcoding prevents dynamic resizing, which is convenient.
  And it is okay because we set the minimum electron window width
  at 800px when we create it.*/
  max-width: 600px;
}

/* Save As modal */
.save-as-modal svg {
  margin-bottom: 0.2rem;
}

.confirm-modal .modal-content {
  background-color: papayawhip;
  margin-top: 100px;
}

.error-boundary {
  max-width:600px;
  margin: 0 auto;
  margin-top: 5rem;
  border-color: black;
  border-style: dotted;
}

.error-boundary .btn {
  margin: 1rem;
}

@@ -592,8 +592,8 @@ describe('Misc form validation stuff', () => {
  const rasterValue = './raster.tif';
  const expectedVal2 = '-79.0198012081401';
  const rasterBox = `[${expectedVal2}, 26.481559513537064, -78.37173806200593, 27.268061760228512]`;
- const message = `Bounding boxes do not intersect: ${vectorValue}: ${vectorBox} | ${rasterValue}: ${rasterBox}`;
- const newPrefix = 'Bounding box does not intersect at least one other:';
+ const message = `Not all of the spatial layers overlap each other. All bounding boxes must intersect: ${vectorValue}: ${vectorBox} | ${rasterValue}: ${rasterBox}`;
+ const newPrefix = 'Not all of the spatial layers overlap each other. Bounding box:';
  const vectorMessage = new RegExp(`${newPrefix}\\s*\\[${expectedVal1}`);
  const rasterMessage = new RegExp(`${newPrefix}\\s*\\[${expectedVal2}`);