All examples are made to work in both the 32-bit and 64-bit versions of the API. As VBA lacks a proper terminal, progress and results are instead written to a worksheet named “output”. Note that writing anything other than a number in cell (1, 1) of the workbook may break some of the examples.
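Since the examples report progress on the “output” worksheet rather than in a console, they can share a small output helper such as the minimal sketch below. The WriteLine name and the choice to append down column A starting at row 2 (leaving cell (1, 1) alone, as noted above) are illustrative, not part of the HUGIN API.

    Option Explicit

    ' Next free row on the "output" sheet; module-level so successive calls append.
    Private outputRow As Long

    ' Append one line of text to column A of the "output" worksheet.
    ' Row 1 is skipped because cell (1, 1) is reserved by some of the examples.
    Public Sub WriteLine(ByVal text As String)
        If outputRow < 2 Then outputRow = 2
        Worksheets("output").Cells(outputRow, 1).Value = text
        outputRow = outputRow + 1
    End Sub

The examples provided are: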
About | All examples are made to work in both the 32-bit and 64-bit versions of the API. |
Load And Propagate | This example is used to load a Bayesian network or a LIMID. |
Build And Propagate | This example describes how a Bayesian network can be constructed using the HUGIN VBA API. |
Net Construction Sample | This example shows a way to build a simple network with the VBA API. |
Node Naming Scheme for OOBN | When creating a runtime domain from a Class, all nodes are named by concatenating the names of the nodes in the list of source nodes (see GetSource()) using a dot character (‘.’) as separator. |
Sequential Learning | This example presents a skeleton for sequential learning. |
This example is used to load a Bayesian network or a LIMID. Once the network is loaded, the corresponding domain is triangulated using the minimum fill-in-weight heuristic and the compilation process is completed. The beliefs and utilities are then printed. Finally, a HUGIN case file is entered, the evidence is propagated, and the new beliefs and utilities are printed.
Note that this program loads networks saved in the .net format; the HUGIN GUI application saves in the .oobn format by default.
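As a rough illustration of this flow, a routine could look like the sketch below. It assumes a project reference to the HUGIN ActiveX Server and reuses the WriteLine helper sketched earlier. Apart from GetNodeByName, every HUGIN class, method, and constant name used here (Globals, LoadDomainFromNet, TriangulateWithMethod, Compile, ParseCase, Propagate, GetNumberOfStates, GetBelief, hTMFillInWeight, hEquilibriumSum, hModeNormal) is an assumption modelled on HUGIN’s naming conventions and should be checked against the installed type library; the file and node names are placeholders.

    ' Hedged sketch of the load-and-propagate flow described above.
    Public Sub LoadAndPropagate()
        Dim hapiGlobals As New HAPI.Globals   ' assumed entry-point class
        Dim dom As HAPI.Domain
        Dim node As HAPI.Node
        Dim i As Long

        ' Load a network saved in the .net format (the real call may also
        ' take a parse-listener argument).
        Set dom = hapiGlobals.LoadDomainFromNet("asia.net")

        ' Triangulate with the minimum fill-in-weight heuristic, then compile.
        dom.TriangulateWithMethod hTMFillInWeight
        dom.Compile

        ' Print the prior beliefs of one node.
        Set node = dom.GetNodeByName("Smoker")
        For i = 0 To node.GetNumberOfStates() - 1
            WriteLine "P(Smoker=" & i & ") = " & node.GetBelief(i)
        Next i

        ' Enter a HUGIN case file, propagate the evidence, and print the new beliefs.
        dom.ParseCase "asia.hcs"
        dom.Propagate hEquilibriumSum, hModeNormal
        For i = 0 To node.GetNumberOfStates() - 1
            WriteLine "P(Smoker=" & i & " | evidence) = " & node.GetBelief(i)
        Next i
    End Sub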
This example describes how a Bayesian network can be constructed using the HUGIN VBA API. The network consists of three numbered nodes: two of them take on the values 0, 1, and 2, and the third takes on the value of the sum of the other two. Once the Bayesian network is constructed, it is saved to a .net specification file and an initial propagation is performed. Finally, the marginals of the nodes are printed.
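A much-simplified construction sketch follows. To keep it short it builds three ordinary binary chance nodes (it does not reproduce the numbered “sum” relation of the real example), and every HUGIN identifier used (GetNewDomain, GetNewNode, hCategoryChance, hKindDiscrete, SetName, SetNumberOfStates, AddParent, GetTable, GetSize, SetDataItem, SaveAsNet, Compile, Propagate, GetBelief) is an assumption to be verified against the type library.

    ' Hedged sketch: build a tiny network by hand, save it as .net, propagate.
    Public Sub BuildAndPropagate()
        Dim hapiGlobals As New HAPI.Globals
        Dim dom As HAPI.Domain
        Dim a As HAPI.Node, b As HAPI.Node, c As HAPI.Node
        Dim tbl As HAPI.Table
        Dim i As Long

        Set dom = hapiGlobals.GetNewDomain()

        ' Create three discrete chance nodes; C gets A and B as parents.
        Set a = dom.GetNewNode(hCategoryChance, hKindDiscrete)
        a.SetName "A"
        a.SetNumberOfStates 2
        Set b = dom.GetNewNode(hCategoryChance, hKindDiscrete)
        b.SetName "B"
        b.SetNumberOfStates 2
        Set c = dom.GetNewNode(hCategoryChance, hKindDiscrete)
        c.SetName "C"
        c.SetNumberOfStates 2
        c.AddParent a
        c.AddParent b

        ' Fill C's conditional probability table with uniform distributions.
        Set tbl = c.GetTable()
        For i = 0 To tbl.GetSize() - 1
            tbl.SetDataItem i, 0.5
        Next i

        ' Save the .net specification, compile, perform the initial
        ' propagation, and print one marginal.
        dom.SaveAsNet "built.net"
        dom.Compile
        dom.Propagate hEquilibriumSum, hModeNormal
        For i = 0 To c.GetNumberOfStates() - 1
            WriteLine "P(C=" & i & ") = " & c.GetBelief(i)
        Next i
    End Sub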
When creating a runtime domain from a Class, all nodes are named by concatenating the names of the nodes in the list of source nodes (for source nodes, see GetSource()) using a dot character (‘.’) as separator. This naming scheme makes it easy to work with the nodes in the runtime domain, as one can use the dot naming convention when calling GetNodeByName(String).
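For example, a runtime node created from an instance named Engine with a source node named Power would be retrieved as “Engine.Power”. The sketch below illustrates the lookup; GetNodeByName comes from the description above, while the class-collection calls (GetNewClassCollection, ParseClasses, GetClassByName, CreateDomain) and the Car/Engine/Power names are assumptions used only for illustration.

    ' Hedged sketch of looking up a runtime node by its dot-separated name.
    Public Sub RuntimeNodeLookup()
        Dim hapiGlobals As New HAPI.Globals
        Dim classes As Object       ' class collection; exact type name not verified
        Dim carClass As Object
        Dim runtimeDom As HAPI.Domain
        Dim node As HAPI.Node

        ' Parse an .oobn class and build a runtime domain from it.
        Set classes = hapiGlobals.GetNewClassCollection()
        classes.ParseClasses "Car.oobn"
        Set carClass = classes.GetClassByName("Car")
        Set runtimeDom = carClass.CreateDomain()

        ' The runtime node is addressed by concatenating the source-node
        ' names with a dot, as described above.
        Set node = runtimeDom.GetNodeByName("Engine.Power")
        WriteLine "Found runtime node: " & node.GetName()
    End Sub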
This example presents a skeleton for sequential learning. Sequential learning, or adaptation, is an update process applied to the conditional probability tables. After a network has been built, sequential learning can be applied during operation in order to maintain the correspondence between the model (conditional probability tables) and the real-world domain.
After the network is loaded into HUGIN, the learning parameters are specified. Cases are then built up and entered, and finally the tables are updated and the node marginals are printed.
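A hedged sketch of that flow is shown below. The adaptation-related calls (GetExperienceTable, SelectState, Adapt, Initialize) mirror the corresponding HUGIN concepts (experience tables, evidence entry, table adaptation, evidence retraction) but, like the placeholder node names Disease and Symptom, they are assumptions that must be checked against the VBA type library.

    ' Hedged sketch of sequential learning (adaptation) on a loaded network.
    Public Sub SequentialLearning()
        Dim hapiGlobals As New HAPI.Globals
        Dim dom As HAPI.Domain
        Dim node As HAPI.Node
        Dim i As Long

        Set dom = hapiGlobals.LoadDomainFromNet("model.net")

        ' Specify the learning parameters: requesting an experience table
        ' marks a node for adaptation (a fading table could be requested
        ' in the same way).
        dom.GetNodeByName("Disease").GetExperienceTable

        dom.Compile

        ' Enter a case as evidence, propagate it, and update the tables.
        dom.GetNodeByName("Symptom").SelectState 0
        dom.Propagate hEquilibriumSum, hModeNormal
        dom.Adapt

        ' Retract the evidence and print the updated marginal of the adapted node.
        dom.Initialize
        Set node = dom.GetNodeByName("Disease")
        For i = 0 To node.GetNumberOfStates() - 1
            WriteLine "P(Disease=" & i & ") = " & node.GetBelief(i)
        Next i
    End Sub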