Update README.md
tr7200 authored Apr 19, 2021
1 parent b28a1a6 commit e37ea7c
Showing 1 changed file with 3 additions and 3 deletions.
README.md
@@ -23,10 +23,10 @@ Why? Considering the intrinsic properties of the causal Bayesian neural network,

Extrinsically, the neural network provides a prediction interval whose upper and lower bounds estimate the amount of uncertainty in its predictions. This research shows that removing the SEM node with the weakest causal connection to firm performance shortens the time needed to find that estimate when uncertainty is added to the neural network through the Kullback-Leibler divergence. This opens up the possibility of re-estimating the SEM after each epoch, in a manner similar to [Dropout](https://patents.google.com/patent/US9406017B2/en) in neural networks, which might be considered 'prior art' for this technique; here, however, it would apply to nodes/edges instead:

-<p align="center">
+<!-- <p align="center">
<img width="400" height="300" src=https://mir-s3-cdn-cf.behance.net/projects/404/11278773.54812a20214b7.jpg>
-</p>
+</p> -->

-*Image credit Behance.net, creative commons license*
+<!-- *Image credit Behance.net, creative commons license* -->

This code uses TensorFlow Probability 0.7.0 for the Bayesian neural network and Scikit-Learn to organize the experiments.
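
As an illustration of the prediction interval described above, the following is a minimal sketch, not the repository's actual code, of how upper and lower bounds might be obtained from a network built with `tfp.layers.DenseFlipout`, whose Kullback-Leibler divergence terms are added to the training loss. It assumes a TF2-style eager setup in which `tfp.layers` works with Keras; the feature matrix `X`, target `y`, and layer sizes are hypothetical placeholders.

```python
# Sketch: KL-regularized Bayesian regression network with a Monte Carlo
# prediction interval. X, y, and the layer sizes are placeholders, not the
# dataset or architecture used in the paper.
import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp

# Hypothetical firm-level features and a firm-performance target.
X = np.random.rand(500, 8).astype("float32")
y = np.random.rand(500, 1).astype("float32")

model = tf.keras.Sequential([
    # Flipout layers keep a distribution over each weight; their KL divergence
    # terms are added to the model's losses and minimized alongside the MSE.
    tfp.layers.DenseFlipout(16, activation="relu"),
    tfp.layers.DenseFlipout(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=50, verbose=0)

# Weights are sampled on every forward pass, so repeated passes disagree;
# the spread of those passes gives the lower and upper prediction bounds.
draws = np.stack([model(X).numpy() for _ in range(100)])
lower, upper = np.percentile(draws, [2.5, 97.5], axis=0)
```

Reading the bounds off quantiles of repeated stochastic forward passes is one common choice; the exact procedure used to construct the interval in this research may differ.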
