Commit f1dd6676 authored by msurl

added week 1 journal

parent 45097e9e
# Week 1
## Python
### Implementation
* I implemented an ILP model to solve the k-hop dominating set problem using minimum vertex separators.
* I refactored the implementation using an object-oriented approach.
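The core of the formulation is a covering condition: every vertex must lie within k hops of some chosen vertex (connectivity is then enforced separately via separator cuts). As a minimal reference sketch, assuming a plain adjacency-dict graph representation (the actual ILP uses a solver, which the journal does not name), the condition can be checked and brute-forced on tiny instances like this:

```python
from itertools import combinations

def k_hop_neighborhood(adj, v, k):
    """Closed k-hop neighborhood of v, via BFS truncated at depth k."""
    seen = {v}
    frontier = {v}
    for _ in range(k):
        frontier = {w for u in frontier for w in adj[u]} - seen
        seen |= frontier
    return seen

def is_k_hop_dominating(adj, chosen, k):
    """True iff every vertex is within k hops of some chosen vertex."""
    return all(any(u in k_hop_neighborhood(adj, v, k) for u in chosen)
               for v in adj)

def min_k_hop_dominating_set(adj, k):
    """Brute force over subsets, smallest first (tiny graphs only)."""
    vertices = list(adj)
    for size in range(1, len(vertices) + 1):
        for subset in combinations(vertices, size):
            if is_k_hop_dominating(adj, set(subset), k):
                return set(subset)
    return set()
```

For a path 0-1-2-3-4 with k=2, the middle vertex alone dominates everything, so `min_k_hop_dominating_set` returns `{2}`. The ILP replaces the subset enumeration with binary variables x_v and one covering inequality per vertex.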
### Packaging
* I fetched a cookiecutter template to create a conda package.
* The package can be built via `conda build` and installed locally. It contains a CLI, so one can use the package as a command-line tool inside a conda environment and pass optional parameters to specify the input and other settings.
* There is still work to do to deliver a clean package.
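A CLI entry point of this kind is typically a thin `argparse` wrapper exposed through the package's entry points. A minimal sketch, with hypothetical flag names (the real tool's options may differ):

```python
import argparse

def build_parser():
    # Flag names here are illustrative, not the package's actual interface.
    parser = argparse.ArgumentParser(
        prog="khop-ds",
        description="Solve the k-hop dominating set ILP on an input graph.")
    parser.add_argument("input",
                        help="path to the input graph file")
    parser.add_argument("-k", "--hops", type=int, default=1,
                        help="domination radius k (default: 1)")
    parser.add_argument("--time-limit", type=float, default=None,
                        help="optional solver time limit in seconds")
    return parser

if __name__ == "__main__":
    args = build_parser().parse_args()
    print(args)
```

Registering `build_parser`-style main functions as console scripts in the package metadata is what makes the tool callable directly inside the conda environment.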
### Runtime
* The first test showed a horrible runtime. I stuck to an ILP formulation similar to the one proposed in _An Efficient Branch and Cut Algorithm to Find Frequently Mutated Subnetworks in Cancer_.
* The middle leaf.lp instance took 7 hours to solve on my laptop (which is not just a glorified toaster).
* I developed and tried different additional constraints that bound the length of the shortest path between the root node and nodes inside the dominating set. This did not improve the runtime.
* I then added an additional constraint mentioned in _Thinning out Steiner trees: a node-based model for uniform edge costs_. This improved the runtime significantly; middle leaf.lp only needed 45 seconds. BUT unfortunately this constraint cannot be applied to our problem, and it increases the number of nodes included in a solution.
* I took screenshots of different runs to document the runtime.
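The separator-based connectivity cuts in this style of branch-and-cut are typically generated lazily: when a candidate solution leaves a selected vertex disconnected from the root, the neighbourhood of the root's component is a vertex separator, and the cut x_v <= sum over the separator of x_s is added. A minimal sketch of that separation step, assuming an adjacency-dict graph (function names are my own, not from the repo):

```python
from collections import deque

def root_component(adj, selected, root):
    """Vertices reachable from root inside the selected set (BFS)."""
    comp, queue = {root}, deque([root])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w in selected and w not in comp:
                comp.add(w)
                queue.append(w)
    return comp

def separator_cut(adj, selected, root, v):
    """If v is selected but disconnected from the root in the
    solution-induced subgraph, return the neighbourhood of the root's
    component: a vertex separator between root and v.  The lazy cut
    to add is  x_v <= sum_{s in separator} x_s.  Returns None if v
    is already connected to the root."""
    comp = root_component(adj, selected, root)
    if v in comp:
        return None  # already connected, no violated cut
    return {w for u in comp for w in adj[u]} - comp
```

On the path 0-1-2-3-4 with selected vertices {0, 3, 4} and root 0, vertex 4 is disconnected and the returned separator is {1}: forcing x_4 <= x_1 rules out this disconnected solution.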
## Literature
* I read parts of the literature added to the repo and tried to figure out whether there are other inequalities defining connectivity that may be applicable to our problem, or other techniques that can improve the runtime while still using vertex separators as the connectivity-defining inequalities.