Release 1.1.0 (#39)
* add test script

* change cease to closure

* Delete tests/pyincore/analyses/businessclosure directory

* removed deprecated methods and classes

* removed outdated test and modified building period method

* Add Lisa Wang fixes.

* removed files and commented code

* #7: Create Repair Curve class to be used with residential recovery analysis

* fixed pep8 issues (#21)

* fixed pep8 issues

* Pyincore util to convert population dislocation output for heatmap #3 (#5)

* add utility function to save as shapefile

* add filter

* write a test

* fix bug

* add snippet of code to test util and upload

* Added conda recipe (#18)

* Added conda recipe

* changed python version to 3.8

* #6: IncoreClient will show the error messages that the API returns. Pytests added (#16)

updated the error messages returned by the client

* added github action for automatic docs build (#24)

* added github action for automatic docs build

* changed the version info in master branch

* modified based on comment

* renamed the file

* #26: run pyincore pytests (#33)

* #26: run pyincore pytests

* #26: Added cache to test performance

* #26: another push to test performance difference

* #26: Move caching to correct place

* #26: another test

* #26: test without mamba

* #26: re-enable mamba

* #34: add percentage in utils cge output to json (#36)

* add percentages, refactor test script, pep8

* joplin pd files

* #28 add joplin building functionality recovery time model (#30)

* restoration model, documentation and test script

* add global target functionality level

* add utility with mean and dev, add numpy arrays, filter no hazard

* FL mapping to Limit states, filter by archetype and chain in tests

* pep8 changes

* Need to pass the dataset type as a parameter to the from_dataframe method #8 (#35)

* add data_type parameter to pass in

* put correct dataset type to set_result

* add test

* fix bug and add one example test on epf

* remove test code

* add to test_format.py

* fix format

* #9 convert pd outputs to json (#27)

* init functions with test jsons

* add dislocation by housing

* add input path and output files

* add test script

* expand jsons with population dislocation keys

* test script and list bug

* race, tenure, total calculations

* pop disl id with hhinc column, improved income json

* improve household code, improved total json

* delete test csv output

* doc modules and csv output to json script

* full test script

* change class name to match the python file name

* change names of keys based on the SC input, add percentages

* add percentages, check total zeroes, improve test script

* except missing category key

* add files to test script

* pep8 changes

* add comma in the list of categories

* merge conflicts in tests

* combine popdisl scripts and tests

* comment out service upload

* fix heatmap filter bug

* remove csv output

* Updated CHANGELOG and set version to 1.1.0

Co-authored-by: ncsa-mo <[email protected]>
Co-authored-by: Diego Calderon <[email protected]>
Co-authored-by: Santiago Nunez-Corrales <[email protected]>
Co-authored-by: Gowtham Naraharisetty <[email protected]>
Co-authored-by: Diego <[email protected]>
Co-authored-by: Chen Wang <[email protected]>
Co-authored-by: YONG WOOK KIM <[email protected]>
Co-authored-by: Gowtham Naraharisetty <[email protected]>
9 people authored Oct 27, 2021
1 parent 661adff commit b0b0675
Showing 58 changed files with 1,560 additions and 1,743 deletions.
87 changes: 87 additions & 0 deletions .github/workflows/doc.yaml
@@ -0,0 +1,87 @@
name: DOC

on:
  push:
    branches:
      - master
      - develop
      - 'release/*'

    tags:
      - '*'

  pull_request:
    branches:
      - master
      - develop
      - 'release/*'

env:
  MAIN_REPO: IN-CORE/pyincore

jobs:

  # ----------------------------------------------------------------------
  # DOCKER BUILD
  # ----------------------------------------------------------------------
  docker:
    runs-on: ubuntu-latest

    steps:
      # checkout source code
      - uses: actions/checkout@v2

      # calculate some variables that are used later
      - name: version information
        run: |
          if [ "${{ github.event.release.target_commitish }}" != "" ]; then
            BRANCH="${{ github.event.release.target_commitish }}"
          elif [[ $GITHUB_REF =~ pull ]]; then
            BRANCH="$(echo $GITHUB_REF | sed 's#refs/pull/\([0-9]*\)/merge#PR-\1#')"
          else
            BRANCH=${GITHUB_REF##*/}
          fi
          echo "GITHUB_BRANCH=${BRANCH}" >> $GITHUB_ENV
          if [ "$BRANCH" == "master" ]; then
            version=$(awk -F= '/^release/ { print $2}' docs/source/conf.py | sed "s/[ ']//g")
            tags="latest"
            oldversion=""
            while [ "${oldversion}" != "${version}" ]; do
              oldversion="${version}"
              tags="${tags},${version}"
              version=${version%.*}
            done
            echo "VERSION=${version}" >> $GITHUB_ENV
            echo "TAGS=${tags}" >> $GITHUB_ENV
          elif [ "$BRANCH" == "develop" ]; then
            echo "VERSION=develop" >> $GITHUB_ENV
            echo "TAGS=develop" >> $GITHUB_ENV
          else
            echo "VERSION=testing" >> $GITHUB_ENV
            echo "TAGS=${BRANCH}" >> $GITHUB_ENV
          fi
      # build image
      - name: Build image
        uses: elgohr/[email protected]
        with:
          name: incore/doc/pyincore
          no_push: true

      # this will publish to NCSA
      - name: Publish to NCSA Hub
        #if: github.event_name != 'pull_request' && github.repository == env.MAIN_REPO
        if: github.repository == env.MAIN_REPO
        uses: elgohr/[email protected]
        env:
          BRANCH: ${{ env.GITHUB_BRANCH }}
          VERSION: ${{ env.VERSION }}
          BUILDNUMBER: ${{ github.run_number }}
          GITSHA1: ${{ github.sha }}
        with:
          registry: hub.ncsa.illinois.edu
          name: incore/doc/pyincore
          username: ${{ secrets.HUB_USERNAME }}
          password: ${{ secrets.HUB_PASSWORD }}
          tags: "${{ env.TAGS }}"
          buildargs: BRANCH,VERSION,BUILDNUMBER,GITSHA1
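For readers skimming the workflow, the `master` branch above derives the Docker tag list by repeatedly trimming the last dotted component of the documentation release version. A minimal Python sketch of that shell loop (an illustration only, not code from this repository; the sample version string is hypothetical):

```python
def doc_image_tags(version):
    """Mirror the shell loop above: 'latest' plus the version and each shorter dotted prefix."""
    tags = ["latest"]
    old_version = None
    while old_version != version:
        old_version = version
        tags.append(version)
        # ${version%.*} drops the last dotted component and leaves the string unchanged if no dot remains
        version = version.rsplit(".", 1)[0]
    return tags


print(doc_image_tags("1.1.0"))  # ['latest', '1.1.0', '1.1', '1']
```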
48 changes: 48 additions & 0 deletions .github/workflows/pytests.yml
@@ -0,0 +1,48 @@
name: pytests

# Runs unit tests on:
# - any push to any branch
# - when a PR is opened/reopened - this is just for additional safety and covers the case of master -> develop PRs without new pushes

# TODO: Use mamba instead of conda for installing packages. Improves on ~5mins it takes to install requirements.
# https://github.com/mamba-org/provision-with-micromamba

on:
  push:

  pull_request:
    types: [opened, reopened]

jobs:
  unit_test:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout source code
        uses: actions/checkout@v2

      - name: Cache conda
        uses: actions/cache@v2
        env:
          # Increase this value to reset cache if environment.yml has not changed
          CACHE_NUMBER: 0
        with:
          path: ~/conda_pkgs_dir
          key:
            ${{ runner.os }}-conda-${{ env.CACHE_NUMBER }}-${{
            hashFiles('environment.yml') }}

      - name: Install miniconda
        uses: conda-incubator/setup-miniconda@v2
        with:
          miniconda-version: "latest"
          mamba-version: "*"
          use-mamba: true
          python-version: 3.8
          activate-environment: base
          environment-file: environment.yml
          use-only-tar-bz2: true

      - name: Run tests with pytest
        run: |
          echo "${{secrets.PYTEST_USER_TOKEN}}" > tests/pyincore/.incorepw
          $CONDA/bin/python -m pytest --ignore=tests/test_format.py --ignore=tests/pyincore/analyses
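To approximate the CI step locally, one possibility is the sketch below. It is not part of the repository and assumes the conda environment from environment.yml is active, the working directory is the repository root, and tests/pyincore/.incorepw holds a valid IN-CORE token, as the workflow's echo step implies.

```python
import pytest

# Mirror the CI selection: skip the PEP 8 format check and the long-running analysis tests.
exit_code = pytest.main([
    "--ignore=tests/test_format.py",
    "--ignore=tests/pyincore/analyses",
])
raise SystemExit(exit_code)
```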
20 changes: 20 additions & 0 deletions CHANGELOG.md
@@ -5,6 +5,26 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/)
and this project adheres to [Semantic Versioning](http://semver.org/).

## [1.1.0] - 2021-10-27

### Added
- Convert HUA and PD outputs to JSON [#9](https://github.com/IN-CORE/pyincore/issues/9)
- Convert population dislocation output to heatmap [#3](https://github.com/IN-CORE/pyincore/issues/3)
- Joplin empirical restoration analysis [#28](https://github.com/IN-CORE/pyincore/issues/28)
- GitHub action to run unit tests [#26](https://github.com/IN-CORE/pyincore/issues/26)
- GitHub action to build documentation [#23](https://github.com/IN-CORE/pyincore/issues/23)
- Conda recipe [#17](https://github.com/IN-CORE/pyincore/issues/17)

### Changed
- Percent change in utils converting CGE output to JSON [#34](https://github.com/IN-CORE/pyincore/issues/34)
- Show API response messages that services return [#6](https://github.com/IN-CORE/pyincore/issues/6)
- Removed deprecated methods [#7](https://github.com/IN-CORE/pyincore/issues/7)

### Fixed
- Pass dataset type as parameter to from_dataframe method [#8](https://github.com/IN-CORE/pyincore/issues/8)
- PEP8 styling issues [#20](https://github.com/IN-CORE/pyincore/issues/20)
- Corrections to residential building recovery [#25](https://github.com/IN-CORE/pyincore/issues/25)

## [1.0.0] - 2021-08-31
### Changed
- Improve runtime efficiency of residential recovery analysis [INCORE1-1339](https://opensource.ncsa.illinois.edu/jira/browse/INCORE1-1339)
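The "Pass dataset type as parameter to from_dataframe method" fix above means a result dataset built from a pandas DataFrame can now carry an explicit type. A rough sketch of the intended call, hedged because the exact signature is not shown in this diff (the DataFrame columns, file name, and data type string are illustrative assumptions):

```python
import pandas as pd
from pyincore import Dataset

# Hypothetical result table; the column names are placeholders for illustration.
df = pd.DataFrame({"guid": ["b-001", "b-002"], "failprob": [0.12, 0.47]})

# Assumed shape of the fixed call: the new data_type argument lets the caller
# state which dataset type the DataFrame represents instead of relying on a default.
result = Dataset.from_dataframe(df, "epf_failure_results.csv",
                                data_type="incore:failureProbability")
```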
File renamed without changes.
4 changes: 2 additions & 2 deletions docs/source/conf.py
@@ -33,9 +33,9 @@
author = ''

# The short X.Y version
version = '1.0'
version = '1.1'
# The full version, including alpha/beta/rc tags
release = '1.0.0'
release = '1.1.0'

# -- General configuration ---------------------------------------------------

21 changes: 19 additions & 2 deletions docs/source/modules.rst
@@ -59,7 +59,7 @@ analyses/example
:members:

analyses/housingrecoverysequential
==============================
==================================
.. autoclass:: housingrecoverysequential.housingrecoverysequential.HousingRecoverySequential
:members:

@@ -89,6 +89,13 @@ analyses/joplincge
.. autofunction:: joplincge.outputfunctions.get_diff
:members:

analyses/joplinempiricalrestoration
===================================
.. autoclass:: joplinempiricalrestoration.joplinempiricalrestoration.JoplinEmpiricalRestoration
:members:
.. autoclass:: joplinempiricalrestoration.joplinempirrestor_util.JoplinEmpirRestorUtil
:members:

analyses/meandamage
===================
.. autoclass:: meandamage.meandamage.MeanDamage
@@ -126,7 +133,7 @@ analyses/populationdislocation
:members:

analyses/residentialbuildingrecovery
============================
====================================
.. autoclass:: residentialbuildingrecovery.residentialbuildingrecovery.ResidentialBuildingRecovery
:members:

@@ -252,6 +259,11 @@ utils/analysisutil
.. autoclass:: utils.analysisutil.AnalysisUtil
:members:

utils/cgeoutputprocess
======================
.. autoclass:: utils.cgeoutputprocess.CGEOutputProcess
:members:

utils/dataprocessutil
=====================
.. autoclass:: utils.dataprocessutil.DataProcessUtil
@@ -272,6 +284,11 @@ utils/geoutil
.. autoclass:: utils.geoutil.GeoUtil
:members:

utils/popdisloutputprocess.py
=============================
.. autoclass:: utils.popdisloutputprocess.PopDislOutputProcess
:members:

utils/networkutil
=================
.. autoclass:: utils.networkutil.NetworkUtil
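The two new utility classes registered above back the heatmap and JSON-conversion features in the changelog. As a rough usage sketch only: the constructor argument and the method names below are inferred from the commit messages (dislocation by race, income, tenure, housing, totals, and a heatmap shapefile) and should be checked against the generated API docs before use.

```python
from pyincore import Dataset
from pyincore.utils.popdisloutputprocess import PopDislOutputProcess

# Assumed input: a population dislocation result loaded as a pyincore Dataset.
# The file path and data type string are placeholders for illustration.
pd_result = Dataset.from_file("pop-dislocation-results.csv",
                              data_type="incore:popDislocation")

pd_process = PopDislOutputProcess(pd_result)       # constructor signature assumed
pd_process.pd_by_race("pd_race_count.json")        # dislocation counts by race
pd_process.pd_by_income("pd_income_count.json")    # by household income bracket
pd_process.pd_by_tenure("pd_tenure_count.json")    # by tenure
pd_process.pd_by_housing("pd_housing_count.json")  # by housing type
pd_process.pd_total("pd_total_count.json")         # totals and percentages
pd_process.get_heatmap_shp("pop-dislocation-numprec.shp")  # shapefile for the heatmap
```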
27 changes: 27 additions & 0 deletions environment.yml
@@ -0,0 +1,27 @@
name: base
channels:
  - conda-forge
  - defaults
dependencies:
  - boto3
  - deprecated
  - fiona>=1.8.4
  - geopandas>=0.6.1
  - ipopt>=3.11
  - jsonpickle>=1.1
  - networkx>=2.2
  - numpy>=1.16.6,<2.0a0
  - owslib>=0.17.1
  - pandas>=0.24.1
  - pycodestyle>=2.6.0
  - pyomo>=5.6
  - pyproj>=1.9.6
  - pytest>=3.9.0
  - python-jose>=3.0
  - pyyaml>=3.13
  - rasterio>=1.0.18
  - requests>=2.21.0
  - rtree>=0.8.3
  - scipy>=1.2.0
  - shapely>=1.6.4.post1
  - wntr>=0.1.6
10 changes: 3 additions & 7 deletions pyincore/__init__.py
@@ -20,15 +20,11 @@
from pyincore.restorationservice import RestorationService
from pyincore.spaceservice import SpaceService
from pyincore.utils.analysisutil import AnalysisUtil
from pyincore.utils.popdisloutputprocess import PopDislOutputProcess
from pyincore.utils.cgeoutputprocess import CGEOutputProcess
from pyincore.dataset import Dataset, InventoryDataset, DamageRatioDataset
from pyincore.models.fragilitycurveset import FragilityCurveSet
from pyincore.models.standardfragilitycurve import StandardFragilityCurve
from pyincore.models.periodbuildingfragilitycurve import PeriodBuildingFragilityCurve
from pyincore.models.periodstandardfragilitycurve import PeriodStandardFragilityCurve
from pyincore.models.customexpressionfragilitycurve import CustomExpressionFragilityCurve
from pyincore.models.conditionalstandardfragilitycurve import ConditionalStandardFragilityCurve
from pyincore.models.parametricfragilitycurve import ParametricFragilityCurve
from pyincore.models.fragilitycurverefactored import FragilityCurveRefactored
from pyincore.models.fragilitycurve import FragilityCurve
from pyincore.models.mappingset import MappingSet
from pyincore.models.mapping import Mapping
from pyincore.networkdata import NetworkData
10 changes: 5 additions & 5 deletions pyincore/analyses/bridgedamage/bridgedamage.py
@@ -13,7 +13,7 @@
from pyincore import AnalysisUtil, GeoUtil
from pyincore import BaseAnalysis, HazardService, FragilityService
from pyincore.analyses.bridgedamage.bridgeutil import BridgeUtil
from pyincore.models.fragilitycurverefactored import FragilityCurveRefactored
from pyincore.models.fragilitycurve import FragilityCurve


class BridgeDamage(BaseAnalysis):
@@ -175,7 +175,7 @@ def bridge_damage_analysis_bulk_input(self, bridges, hazard_type,
dmg_intervals = dict()
selected_fragility_set = fragility_set[bridge["id"]]

if isinstance(selected_fragility_set.fragility_curves[0], FragilityCurveRefactored):
if isinstance(selected_fragility_set.fragility_curves[0], FragilityCurve):
# Supports multiple demand types in same fragility
hazard_val = AnalysisUtil.update_precision_of_lists(hazard_vals[i]["hazardValues"])
input_demand_types = hazard_vals[i]["demands"]
@@ -190,9 +190,9 @@ def bridge_damage_analysis_bulk_input(self, bridges, hazard_type,
if not AnalysisUtil.do_hazard_values_have_errors(hazard_vals[i]["hazardValues"]):
bridge_args = selected_fragility_set.construct_expression_args_from_inventory(bridge)
dmg_probability = \
selected_fragility_set.calculate_limit_state_refactored_w_conversion(hval_dict,
inventory_type="bridge",
**bridge_args)
selected_fragility_set.calculate_limit_state(hval_dict,
inventory_type="bridge",
**bridge_args)
dmg_intervals = selected_fragility_set.calculate_damage_interval(dmg_probability,
hazard_type=hazard_type,
inventory_type="bridge")
6 changes: 3 additions & 3 deletions pyincore/analyses/buildingdamage/buildingdamage.py
@@ -11,7 +11,7 @@
from pyincore import BaseAnalysis, HazardService, \
FragilityService, AnalysisUtil, GeoUtil
from pyincore.analyses.buildingdamage.buildingutil import BuildingUtil
from pyincore.models.fragilitycurverefactored import FragilityCurveRefactored
from pyincore.models.fragilitycurve import FragilityCurve


class BuildingDamage(BaseAnalysis):
@@ -174,7 +174,7 @@ def building_damage_analysis_bulk_input(self, buildings, retrofit_strategy, haza
building_period = selected_fragility_set.fragility_curves[0].get_building_period(num_stories)

# TODO: Once all fragilities are migrated to new format, we can remove this condition
if isinstance(selected_fragility_set.fragility_curves[0], FragilityCurveRefactored):
if isinstance(selected_fragility_set.fragility_curves[0], FragilityCurve):
# Supports multiple demand types in same fragility
b_haz_vals = AnalysisUtil.update_precision_of_lists(hazard_vals[i]["hazardValues"])
b_demands = hazard_vals[i]["demands"]
@@ -192,7 +192,7 @@ def building_damage_analysis_bulk_input(self, buildings, retrofit_strategy, haza
if not AnalysisUtil.do_hazard_values_have_errors(hazard_vals[i]["hazardValues"]):
building_args = selected_fragility_set.construct_expression_args_from_inventory(b)

dmg_probability = selected_fragility_set.calculate_limit_state_refactored_w_conversion(
dmg_probability = selected_fragility_set.calculate_limit_state(
hval_dict, **building_args, period=building_period)
dmg_interval = selected_fragility_set.calculate_damage_interval(
dmg_probability, hazard_type=hazard_type, inventory_type="building")
@@ -127,20 +127,21 @@ def cumulative_building_damage(self, eq_building_damage,

limit_states = collections.OrderedDict()

limit_states["LS_0"] = eq_limit_states["LS_0"] + tsunami_limit_states["LS_0"] \
- eq_limit_states["LS_0"] * tsunami_limit_states["LS_0"]

limit_states["LS_1"] = eq_limit_states["LS_1"] + tsunami_limit_states["LS_1"] \
- eq_limit_states["LS_1"] * tsunami_limit_states["LS_1"] \
+ ((eq_limit_states["LS_0"] - eq_limit_states["LS_1"]) * (
tsunami_limit_states["LS_0"] -
tsunami_limit_states["LS_1"]))

limit_states["LS_2"] = eq_limit_states["LS_2"] + tsunami_limit_states["LS_2"] \
- eq_limit_states["LS_2"] * tsunami_limit_states["LS_2"] \
+ ((eq_limit_states["LS_1"] - eq_limit_states["LS_2"]) * (
tsunami_limit_states["LS_1"] -
tsunami_limit_states["LS_2"]))
limit_states["LS_0"] = \
eq_limit_states["LS_0"] + tsunami_limit_states["LS_0"] \
- eq_limit_states["LS_0"] * tsunami_limit_states["LS_0"]

limit_states["LS_1"] = \
eq_limit_states["LS_1"] + tsunami_limit_states["LS_1"] \
- eq_limit_states["LS_1"] * tsunami_limit_states["LS_1"] \
+ ((eq_limit_states["LS_0"]
- eq_limit_states["LS_1"]) * (tsunami_limit_states["LS_0"] - tsunami_limit_states["LS_1"]))

limit_states["LS_2"] = \
eq_limit_states["LS_2"] + tsunami_limit_states["LS_2"] \
- eq_limit_states["LS_2"] * tsunami_limit_states["LS_2"] \
+ ((eq_limit_states["LS_1"]
- eq_limit_states["LS_2"]) * (tsunami_limit_states["LS_1"] - tsunami_limit_states["LS_2"]))

damage_state = FragilityCurveSet._3ls_to_4ds(limit_states)

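The hunk above only re-wraps the combined earthquake-plus-tsunami limit-state arithmetic; the values are unchanged. Read as equations (my transcription of the code, with E_i and T_i the earthquake and tsunami probabilities of limit state LS_i):

```latex
P(LS_0) = E_0 + T_0 - E_0 T_0
P(LS_i) = E_i + T_i - E_i T_i + (E_{i-1} - E_i)(T_{i-1} - T_i), \qquad i \in \{1, 2\}
```

The three combined limit states are then mapped to four damage states by FragilityCurveSet._3ls_to_4ds, as the last line of the hunk shows.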