Inspired by Jack Edmonds and my professor Dr. Rainer Schrader, I decided to start doing research on intersections of three matroids. Matroids are really cool. The motivation comes from linear algebra, but if you want to keep it simple, you can just define them as follows:

**Definition**: A *matroid* is a pair $(E, \mathcal{I})$ where $E$ is a finite set, $\mathcal{I} \subseteq 2^E$, and the following constraints hold:

- $\mathcal{I}$ is not empty
- For every $A \in \mathcal{I}$ and $B \subseteq A$ we have $B \in \mathcal{I}$
- For $A, B \in \mathcal{I}$ with $|A| < |B|$ there is a $b \in B \setminus A$ such that $A \cup \{b\} \in \mathcal{I}$

Sets in $\mathcal{I}$ are called *independent*.

If we have weights on the elements of $E$, and the weight of a subset of $E$ is defined as the sum of the weights of all its elements, we can use the greedy algorithm to find an optimum independent set in polynomial time. Of course, for that we need a method that allows us to check in polynomial time whether a given set is independent or not — an independence oracle.
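To make this concrete, here is a minimal sketch of the greedy algorithm (the function names and the example graph are my own, not from any particular library). The independence oracle here is the graphic matroid of a small undirected graph: a set of edges is independent iff it contains no cycle, which we check with a simple union-find.

```python
def greedy(elements, weight, is_independent):
    """Greedily build a maximum-weight independent set of a matroid."""
    solution = []
    for e in sorted(elements, key=weight, reverse=True):
        if is_independent(solution + [e]):
            solution.append(e)
    return solution

def forms_forest(edges):
    """Independence oracle of the graphic matroid: edge set has no cycle."""
    parent = {}
    def find(v):
        parent.setdefault(v, v)
        while parent[v] != v:
            v = parent[v]
        return v
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru == rv:          # both endpoints already connected: cycle
            return False
        parent[ru] = rv
    return True

# Example: maximum-weight spanning forest of a triangle plus a pendant edge.
weights = {('a', 'b'): 3, ('b', 'c'): 2, ('a', 'c'): 5, ('c', 'd'): 1}
best = greedy(list(weights), lambda e: weights[e], forms_forest)
```

The greedy algorithm picks `('a','c')` and `('a','b')`, skips `('b','c')` because it would close the triangle, and finishes with `('c','d')`.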

So what is the intersection of matroids?

**Definition**: Let $M_i = (E, \mathcal{I}_i)$ for $i = 1, \dots, n$ be matroids on the same ground set $E$. The *intersection* of the $n$ matroids is defined as $(E, \mathcal{I}_1 \cap \dots \cap \mathcal{I}_n)$. In general this is not a matroid, but at least it is an independence system.
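As a toy illustration (my own example, not from the definition above): matchings in a bipartite graph are exactly the common independent sets of two partition matroids, one bounding the degree of each left vertex and one of each right vertex.

```python
def partition_matroid_oracle(class_of, capacity=1):
    """Independence oracle of a partition matroid:
    at most `capacity` elements from each class."""
    def is_independent(subset):
        counts = {}
        for e in subset:
            c = class_of(e)
            counts[c] = counts.get(c, 0) + 1
            if counts[c] > capacity:
                return False
        return True
    return is_independent

# Ground set: edges (u, v) of a bipartite graph.
left_ok  = partition_matroid_oracle(lambda e: e[0])  # one edge per left vertex
right_ok = partition_matroid_oracle(lambda e: e[1])  # one edge per right vertex

def common_independent(subset):
    """Independent in the intersection of both matroids: a matching."""
    return left_ok(subset) and right_ok(subset)
```

A set of edges is a matching exactly when it is independent in both partition matroids at once.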

There is a polynomial-time algorithm (due to Edmonds) for finding an optimum set that is independent in two matroids. However, the Hamiltonian path problem in a directed graph can be expressed as the intersection of three matroids — the graphic matroid of the underlying undirected graph together with the two partition matroids that bound each vertex's in-degree and out-degree by one — so in general, finding an optimum set that is independent in three matroids is NP-hard.

The question now is: is it always NP-hard to find an optimum independent set in the non-trivial intersection of more than two matroids? If not, can we characterize the matroids for which the problem remains hard? If yes, we would have a good method for proving the NP-hardness of a problem.

As far as I know, every problem that can be expressed as a non-trivial intersection of three matroids is NP-hard.

For my research I will start with the colored branching problem: given a directed graph with colored, weighted edges, find an optimum branching that contains exactly $k$ blue edges.

This problem can obviously be expressed as an intersection of three matroids, but it does not seem to be very difficult. Maybe I can find a polynomial-time algorithm for solving it.
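One way to see the three-matroid structure is via three independence oracles (this is my own sketch, and all names here are made up; note that a matroid constraint naturally gives "at most $k$ blue edges" — forcing exactly $k$ needs extra care and is not modeled below):

```python
def is_forest(edges):
    """Oracle 1, graphic matroid: ignoring directions, no cycle."""
    parent = {}
    def find(v):
        parent.setdefault(v, v)
        while parent[v] != v:
            v = parent[v]
        return v
    for u, v, _color in edges:
        ru, rv = find(u), find(v)
        if ru == rv:
            return False
        parent[ru] = rv
    return True

def in_degree_ok(edges):
    """Oracle 2, partition matroid on heads: in-degree at most one."""
    heads = [v for _u, v, _color in edges]
    return len(heads) == len(set(heads))

def blue_budget_ok(edges, k):
    """Oracle 3: at most k blue edges."""
    return sum(1 for _u, _v, color in edges if color == 'blue') <= k

def feasible(edges, k):
    """Independent in the intersection of all three matroids."""
    return is_forest(edges) and in_degree_ok(edges) and blue_budget_ok(edges, k)
```

An edge set satisfying all three oracles is a branching with at most $k$ blue edges; the open part of my problem is optimizing over such sets.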