- It seems, as I assumed, that when "high value" vertices are missing there are too many alternatives which have to be excluded in the iteration process.

- I should create tables with the differences in characteristics of those grid-like graphs and random graphs.

- On "thin" random graphs (width << length) the algorithm performs much better; I assume this is because there are not as many alternatives.

- A comparison between ILP and ASP hasn't been done yet but is important to see if this implementation is at least competitive on random graphs! ❌

- Measure and note the gap between |D| of a minimal connected and a minimal unconnected solution. For each |D|, measure the number of unconnected solutions which were found (ILP only) and how many constraints were added in total.

- Measure the time needed to find the first (nearly) optimal solution (strong upper bound) and the time needed to close the gap.

- - Constraints ✔

- - Number of unconnected solutions is equal to lazily added constraints ✔

- - It would still be interesting to measure the number of solutions which exist for each size, completely omitting any connectivity constraint. ❌

- Measure the time needed to find the first (nearly) optimal solution (strong upper bound) and the time needed to close the gap. ✔
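The connectivity check behind these measurements can be sketched as follows (a minimal sketch in pure Python; the adjacency-dict representation and the function names are my own, not taken from the actual implementation). Each ILP incumbent D is tested by a BFS on the induced subgraph G[D]; every incumbent that fails the test triggers exactly one round of lazily added constraints, which is why the two counts above match.

```python
from collections import deque

def induced_components(adj, D):
    """Connected components of the subgraph induced by vertex set D (BFS)."""
    D = set(D)
    seen, comps = set(), []
    for s in sorted(D):
        if s in seen:
            continue
        comp, queue = {s}, deque([s])
        seen.add(s)
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v in D and v not in seen:
                    seen.add(v)
                    comp.add(v)
                    queue.append(v)
        comps.append(comp)
    return comps

def is_connected_solution(adj, D):
    """True iff the incumbent D induces a connected subgraph."""
    return len(induced_components(adj, D)) <= 1

# Path 0-1-2-3: {0, 3} induces an unconnected subgraph, {1, 2} a connected one.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(is_connected_solution(path, {0, 3}))  # False
print(is_connected_solution(path, {1, 2}))  # True
```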

* Make some more detailed tests and check for the following connections:

- To what degree does the number of unconnected k-hop solutions with at most as many nodes as an optimal connected solution affect the runtime? (Calculate how many of them exist and add this data as a column to the table of test results.)

- Is there a clear correlation between the number of constraints which were added and the runtime for (possibly unconnected) solutions?
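Counting these solutions is feasible by brute force on small instances. A sketch (pure Python; shown for plain 1-hop domination rather than the general k-hop case, and all names are my own): enumerate all vertex subsets up to the size of an optimal connected solution and count the dominating ones whose induced subgraph is unconnected.

```python
from collections import deque
from itertools import combinations

def dominates(adj, D):
    """True iff every vertex lies in the closed neighborhood of some v in D."""
    covered = set()
    for v in D:
        covered |= {v} | set(adj[v])
    return covered == set(adj)

def connected(adj, D):
    """True iff D induces a connected subgraph (BFS from one vertex of D)."""
    D = set(D)
    start = min(D)
    seen, queue = {start}, deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v in D and v not in seen:
                seen.add(v)
                queue.append(v)
    return seen == D

def count_unconnected_dominating(adj, size_bound):
    """Count unconnected dominating sets with at most `size_bound` vertices."""
    vertices = sorted(adj)
    return sum(
        1
        for k in range(1, size_bound + 1)
        for D in combinations(vertices, k)
        if dominates(adj, D) and not connected(adj, D)
    )

# Path 0-1-2-3-4: the optimal connected dominating set {1, 2, 3} has size 3.
p5 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(count_unconnected_dominating(p5, 3))  # 10
```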

...

...


* Read _Imposing Connectivity Constraints in Forest Planning Models_. ✔

- Check for different constraints which could strengthen the formulation. (with focus on symmetry breaking)

- They did not mention a symmetry breaker, *but* they mentioned some other types of inequalities which could strengthen the formulation with respect to the connectivity specification.

- The two most promising seem to be:

- Only adding separators involving the root node but not between connected components

- The rooted ring inequalities

- An interesting point: for their specific problems (which were not really close to MCDS and only had connectivity in common) they achieved a much stronger LP bound.

- They added cuts before an ILP solution was found and also added cuts for fractional LP solutions.

- They solved different subproblems iteratively and used the previous results as a heuristic for the next iteration step.
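The first idea (root-only separators) can be sketched like this (my own minimal Python sketch, not the paper's code). Fix a root r that is forced into the solution; for every component C of the induced subgraph that does not contain r, the neighborhood N(C) separates C from r, so one cut of the form sum over v in N(C) of x_v >= 1 per such component suffices, and no separators between pairs of non-root components are generated.

```python
from collections import deque

def induced_components(adj, D):
    """Connected components of the subgraph induced by vertex set D (BFS)."""
    D = set(D)
    seen, comps = set(), []
    for s in sorted(D):
        if s in seen:
            continue
        comp, queue = {s}, deque([s])
        seen.add(s)
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v in D and v not in seen:
                    seen.add(v)
                    comp.add(v)
                    queue.append(v)
        comps.append(comp)
    return comps

def rooted_separator_cuts(adj, D, root):
    """For each component C of the subgraph induced by D + {root} that does
    not contain `root`, return N(C) \\ C: a vertex separator between C and
    the root.  Each returned set S yields one cut sum_{v in S} x_v >= 1."""
    cuts = []
    for comp in induced_components(adj, set(D) | {root}):
        if root in comp:
            continue
        sep = set().union(*(set(adj[u]) for u in comp)) - comp
        cuts.append(sep)
    return cuts

# Path 0-1-2-3-4, root 2, incumbent {0, 4}: components {0} and {4} each
# get one separator cut, through vertex 1 and vertex 3 respectively.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(sorted(sorted(c) for c in rooted_separator_cuts(path, {0, 4}, root=2)))
# [[1], [3]]
```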

* Read _An Efficient Branch and Cut Algorithm to Find Frequently Mutated Subnetworks in Cancer_ again with focus on symmetry breaking.

* Read through _An Integer Programming Approach for Fault-Tolerant Connected Dominating Sets_ again and check for symmetry breaking or other constraints to tighten up the space of solutions. ✔

- I could not find anything about symmetry breaking or additional inequalities for the case k=d=1 (which is standard MCDS). The table of results was interesting, though, because they also tested their implementation for the case k=d=1, which is then equal to our ILP formulation. Their results were not bad, but unfortunately I could not find a more detailed description of their test graphs; only the number of nodes and the density are given. Those two properties are not sufficient, as my own tests on random graphs revealed.